Social robots that communicate with us and imitate human behavioural patterns are an important topic for the future. While many studies explore their design and acceptance, there has been little research into their marketability so far. This work focuses on the use of social robots in health and care, where their future integration holds enormous potential. A study with 197 participants from Italy and Germany examines desired functionalities and purchase preferences, taking cultural differences into account. It confirmed the importance of several dimensions of the ALMERE model (e.g. perceived enjoyment, usefulness, and trustworthiness). Acceptance correlates strongly with willingness to invest. Many older people regard social robots as "assistive technical devices" and expect them to be subsidised by insurers and public funding. To facilitate their future deployment, social robots should be included in databases of medical assistive devices.
The transition from college to university can have a variety of psychological effects on students, who must cope with daily obligations by themselves in a new setting; this can result in loneliness and social isolation. Mobile technology, specifically mental health apps (MHapps), has been seen as a promising way to assist university students facing these problems; however, there is little evidence around this topic. My research investigates how a mobile app can be designed to reduce social isolation and loneliness among university students. The Noneliness app is being developed to this end; it aims to create social opportunities through a quest-based gamified system in a secure and collaborative network of local users. Initial evaluations with the target audience provided evidence on how an app should be designed for this purpose. These results are presented, along with how they helped me plan the next steps towards my research goals. The paper was presented at the MobileHCI 2020 Doctoral Consortium.
Loneliness, an emotional distress caused by the lack of meaningful social connections, has been increasingly affecting university students who need to deal with everyday situations in a new setting, especially those who have come from abroad. Currently there is little work on digital solutions to reduce loneliness. Therefore, this work describes the general design considerations for mobile apps in this context and outlines a potential solution. The mobile app Noneliness is used to this end: it aims to reduce loneliness by creating social opportunities through a quest-based gamified system in a secure and collaborative network of local users. The results of initial evaluations with the target audience are described. The results informed a user interface redesign as well as a review of the features and the gamification principles adopted.
This paper describes a comparative study of two tactile systems supporting navigation for persons with little or no visual and auditory perception. The efficacy of a tactile head-mounted device (HMD) was compared to that of a wearable device, a tactile belt. A study with twenty participants showed that the participants took significantly less time to complete a course when navigating with the HMD, as compared to the belt.
A Gamified and Adaptive Learning System for Neurodivergent Workers in Electronic Assembling Tasks
(2020)
Learning and work-oriented assistive systems are often designed to fit the workflow of neurotypical workers. Neurodivergent workers and individuals with learning disabilities often present cognitive and sensorimotor characteristics that are better accommodated with personalized learning and working processes. Therefore, we designed an adaptive learning system that combines an augmented interaction space with user-sensitive virtual assistance to support step-by-step guidance for neurodivergent workers in electronic assembling tasks. Gamified learning elements were also included in the interface to provide self-motivation and praise whenever users progress in their learning and work achievements.
Wow, You Are Terrible at This!: An Intercultural Study on Virtual Agents Giving Mixed Feedback
(2020)
While the effects of virtual agents in terms of likeability, uncanniness, etc. are well explored, it is unclear how their appearance and the feedback they give affect people's reactions. Is critical feedback from an agent embodied as a mouse or a robot taken less seriously than from a human agent? In an intercultural study with 120 participants from Germany and the US, participants had to find hidden objects in a game and received feedback on their performance from virtual agents with different appearances. As some levels were designed to be unsolvable, critical feedback was unavoidable. We hypothesized that feedback would be taken more seriously the more human the agent looked. We also expected the subjects from the US to react more sensitively to criticism. Surprisingly, our results showed that the agents' appearance did not significantly change the participants' perception. Also, while we found highly significant differences in inspirational and motivational effects as well as in perceived task load between the two cultures, the reactions to criticism were contrary to expectations based on established cultural models. This work improves our understanding of how affective virtual agents should be designed, both with respect to culture and to dialogue strategies.
Tactile Navigation with Checkpoints as Progress Indicators?: Only when Walking Longer Straight Paths
(2020)
Persons with both vision and hearing impairments have to rely primarily on tactile feedback, which is frequently used in assistive devices. We explore the use of checkpoints as a way to give them feedback during navigation tasks. In particular, we investigate how checkpoints can affect performance and user experience. We hypothesized that individuals receiving checkpoint feedback would take less time and perceive the navigation experience as superior to those who did not receive such feedback. Our contribution is two-fold: (1) a detailed report on the implementation of a smart wearable with tactile feedback, and (2) a user study analyzing its effects. The results show that, in contrast to our assumptions, individuals took considerably more time to complete routes with checkpoints. They also perceived navigating with checkpoints as inferior to navigating without checkpoints. While the quantitative data leave little room for doubt, the qualitative data open new aspects: when walking straight and not being "overwhelmed" by various forms of feedback in succession, several participants actually appreciated the checkpoint feedback.
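The checkpoint mechanism in this abstract could be sketched as a loop that fires a distinct vibration pattern whenever the wearer comes within reach of the next unvisited waypoint. The pulse patterns, the 1-metre trigger radius, and the `navigation_feedback` helper below are illustrative assumptions, not details from the paper:

```python
import math

# Hypothetical vibration patterns (motor on/off durations in ms);
# the actual patterns used in the study are not specified here.
CHECKPOINT_PULSE = [300, 100, 300]  # long double pulse marking progress


def distance(a, b):
    """Euclidean distance between two 2-D positions (metres)."""
    return math.hypot(a[0] - b[0], a[1] - b[1])


def navigation_feedback(position, checkpoints, radius=1.0):
    """Return the checkpoint pulse when the user is within `radius`
    metres of the next unvisited checkpoint, consuming that checkpoint;
    otherwise return None so only directional cues are given."""
    if checkpoints and distance(position, checkpoints[0]) <= radius:
        checkpoints.pop(0)  # checkpoint reached -> progress indicated
        return CHECKPOINT_PULSE
    return None


route = [(0.0, 5.0), (0.0, 10.0)]
print(navigation_feedback((0.0, 4.5), route))  # within 1 m of first checkpoint
```

A real device would interleave this with continuous directional feedback, which is exactly the succession of cues some participants found overwhelming.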
Deafblindness, also known as dual sensory loss, is the combination of sight and hearing impairments of such extent that it becomes difficult for one sense to compensate for the other. Communication issues are a key concern for the Deafblind community. We present the design and technical implementation of the Tactile Board: a mobile Augmentative and Alternative Communication (AAC) device for individuals with deafblindness. The Tactile Board allows text and speech to be translated into vibrotactile signs that are displayed in real time to the user via a haptic wearable. Our aim is to facilitate communication for the deafblind community, creating opportunities for these individuals to initiate and engage in social interactions with other people without the direct need for an intervener.
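The text-to-vibrotactile translation could be sketched as a lookup from characters to motor event sequences played on the wearable. The sign table below is an assumption for illustration; the Tactile Board's actual sign vocabulary is not described in this abstract:

```python
# Illustrative sketch: each character maps to a sequence of
# (motor_index, duration_ms) events on the haptic wearable.
# This sign table is an assumption, not the Tactile Board's real one.
TACTILE_SIGNS = {
    "h": [(0, 100), (0, 100), (0, 100), (0, 100)],
    "i": [(0, 100), (0, 100)],
    " ": [(None, 400)],  # pause between words, no motor active
}


def text_to_vibration(text):
    """Flatten a text string into a playable event sequence,
    inserting a short gap after each character's sign."""
    events = []
    for ch in text.lower():
        events.extend(TACTILE_SIGNS.get(ch, []))
        events.append((None, 200))  # inter-character gap
    return events


print(text_to_vibration("hi")[:3])
```

Speech input would first pass through speech-to-text, then reuse the same translation step.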
Deafblindness, a form of dual sensory impairment, significantly impacts communication, access to information and mobility. Independent navigation and wayfinding are main challenges faced by individuals living with combined hearing and visual impairments. We developed a haptic wearable that provides sensory substitution and navigational cues for users with deafblindness by conveying vibrotactile signals onto the body. Vibrotactile signals on the waist area convey directional and proximity information collected via a fisheye camera attached to the garment, while semantic information is provided with a tapping system on the shoulders. A playful scenario called "Keep Your Distance" was designed to test the navigation system: individuals with deafblindness were "secret agents" that needed to follow a "suspect", but they should keep an optimal distance of 1.5 meters from the other person to win the game. Preliminary findings suggest that individuals with deafblindness enjoyed the experience and were generally able to follow the directional cues.
Interaction with, and capturing information from, our surroundings is dominated by vision and hearing. Haptics, on the other hand, widens the bandwidth and can also substitute for impaired senses (sense switching). Haptic technologies are often limited to point-wise actuation. Here, we show that actuation in two-dimensional matrices instead creates a richer input. We describe the construction of a full-body garment for haptic communication with a distributed actuating network. The garment is divided into attachable-detachable panels, or add-ons, each of which can carry a two-dimensional matrix of actuating haptic elements. Each panel adds to the enhanced sensory capability of the human-garment system, so that together a 720° system is formed. The spatial separation of the panels across different body locations supports semantic and theme-wise separation of the conversations conveyed by haptics. It also achieves directional faithfulness, i.e. maintaining any directional information about a distal stimulus in the haptic input.
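Directional faithfulness, as described above, could be sketched as a mapping from the azimuth of a distal stimulus to the panel and matrix column that lie in that direction on the body. The four-panel layout and four-column matrices below are assumptions for illustration, not the garment's actual configuration:

```python
# Sketch of directional faithfulness: map a stimulus azimuth
# (0-360 degrees around the wearer) to a garment panel and the
# column of that panel's 2-D actuator matrix. The panel layout
# and column counts here are illustrative assumptions.
PANELS = [
    ("front", 315, 45, 4),   # name, start deg, end deg, columns
    ("right", 45, 135, 4),
    ("back",  135, 225, 4),
    ("left",  225, 315, 4),
]


def azimuth_to_actuator(azimuth_deg):
    """Return (panel_name, column) so the actuated location on the
    body surface preserves the direction of the distal stimulus."""
    a = azimuth_deg % 360
    for name, start, end, cols in PANELS:
        span = (end - start) % 360          # angular width of the panel
        offset = (a - start) % 360          # position within that width
        if offset < span:
            return name, int(offset / span * cols)
    return PANELS[0][0], 0  # unreachable with a full 360-degree cover


print(azimuth_to_actuator(90))  # stimulus directly to the wearer's right
```

The row of the matrix stays free for a second dimension, such as elevation or intensity, which is what makes 2-D matrices richer than point-wise actuation.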