Social robots not only work with humans in collaborative workspaces; we also meet them in shopping malls and in even more personal settings like health and care. Does this imply they should become more human, able to interpret and adequately respond to human emotions? Do we want them to help elderly persons? Do we want them to support us when we are old ourselves? Do we want them to just clean and keep things orderly, or would we accept them helping us go to the toilet, or even feeding us if we suffer from Parkinson's disease?
The answers to these questions differ from person to person. They depend on cultural background and personal experiences, but probably most of all on the robot in question. This book covers the phenomenon of social robots from its historic roots to today's best practices and future perspectives. To achieve this, we used a hands-on, interdisciplinary approach, incorporating findings from computer scientists, engineers, designers, psychologists, doctors, nurses, historians and many more. The book also covers a vast spectrum of applications, ranging from collaborative industrial work and education to sales. Especially for developments with a high societal impact, such as robots in health and care settings, the authors discuss not only technology, design and usage but also ethical aspects.
This book thus serves as both a compendium and a guideline, helping to navigate the design space for future developments in social robotics.
What emotional effects does gamification have on users who work or learn with repetitive tasks? In this work, we use biosignals to analyze these affective effects of gamification. After a brief discussion of related work, we describe the implementation of an assistive system that augments work by projecting elements for guidance and gamification. We also show how this system can be extended to analyze users' emotions. In a user study, we analyze both biosignals (facial expressions and electrodermal activity) and regular performance measures (error rate and task completion time).
For the performance measures, the results confirm known effects such as increased speed and a slightly increased error rate. In addition, the analysis of the biosignals provides strong evidence for two major affective effects: gamifying work and learning tasks elicits highly significantly more positive emotions and increases emotionality overall. The results inform the design of assistive systems that are aware of the physical as well as the affective context.
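The abstract does not include implementation details; the following is only a minimal sketch, under assumed data structures, of how the reported biosignal and performance measures might be summarized per condition. The record fields, the simple EDA-variability proxy for arousal, and the commented-out loader are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch (not the authors' implementation): summarizing
# electrodermal activity (EDA) and performance measures per condition.
from dataclasses import dataclass
from typing import List
import statistics

@dataclass
class TrialRecord:
    eda_samples: List[float]      # skin conductance samples (microsiemens)
    errors: int                   # number of task errors
    completion_time_s: float      # task completion time in seconds

def eda_reactivity(samples: List[float]) -> float:
    """Crude proxy for emotional arousal: variability of the EDA signal."""
    return statistics.pstdev(samples)

def summarize(condition: List[TrialRecord]) -> dict:
    """Aggregate the measures reported in the study for one condition."""
    return {
        "mean_eda_reactivity": statistics.mean(
            eda_reactivity(t.eda_samples) for t in condition),
        "mean_errors": statistics.mean(t.errors for t in condition),
        "mean_completion_time_s": statistics.mean(
            t.completion_time_s for t in condition),
    }

# Usage idea: compare a gamified and a non-gamified condition; the abstract
# reports faster completion, a slightly higher error rate, and higher
# emotionality with gamification.
# gamified, baseline = load_conditions(...)   # hypothetical loader
# print(summarize(gamified), summarize(baseline))
```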
Social robots differ from service robots in that they also master more complex interaction and communication. Some can simulate or even recognize emotions. There are many areas of application, from the household and nursing care to the medical field. Where are the limits of current systems? What should social robots look like, and how should they interact, to be perceived as useful helpers rather than competitors? This article gives a brief overview of existing social robots. It examines their acceptance in the important area of health and care based on the results of an expert study and offers a temporal perspective on their further development.
This paper describes a comparative study of two tactile systems supporting navigation for persons with little or no visual and auditory perception. The efficacy of a tactile head-mounted device (HMD) was compared to that of a wearable device, a tactile belt. A study with twenty participants showed that they took significantly less time to complete a course when navigating with the HMD than with the belt.
Tactile Navigation with Checkpoints as Progress Indicators? Only when Walking Longer Straight Paths
(2020)
Persons with both vision and hearing impairments have to rely primarily on tactile feedback, which is frequently used in assistive devices. We explore the use of checkpoints as a way to give them feedback during navigation tasks. In particular, we investigate how checkpoints can impact performance and user experience. We hypothesized that individuals receiving checkpoint feedback would take less time and rate the navigation experience more positively than those who did not receive such feedback. Our contribution is two-fold: (1) a detailed report on the implementation of a smart wearable with tactile feedback, and (2) a user study analyzing its effects. The results show that, in contrast to our assumptions, individuals took considerably more time to complete routes with checkpoints. They also perceived navigating with checkpoints as inferior to navigating without checkpoints. While the quantitative data leave little room for doubt, the qualitative data open up new aspects: when walking straight and not being "overwhelmed" by various forms of feedback in succession, several participants actually appreciated the checkpoint feedback.
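As a rough illustration of the checkpoint idea described above, the sketch below shows one assumed way to fire a single tactile confirmation pulse when the wearer comes within a trigger radius of the next checkpoint on a route. The class name, the 2 m radius, and the pulse callback are illustrative assumptions, not the study's implementation.

```python
# Illustrative sketch (assumed design): one confirmation pulse per checkpoint.
import math
from typing import Callable, List, Tuple

Point = Tuple[float, float]  # (x, y) position in metres

def distance(a: Point, b: Point) -> float:
    return math.hypot(a[0] - b[0], a[1] - b[1])

class CheckpointNavigator:
    def __init__(self, checkpoints: List[Point], pulse: Callable[[], None],
                 trigger_radius_m: float = 2.0):
        self.checkpoints = checkpoints
        self.pulse = pulse                  # e.g. drive a vibration motor once
        self.trigger_radius_m = trigger_radius_m
        self.next_index = 0

    def update(self, position: Point) -> None:
        """Call with each new position estimate; fires once per checkpoint."""
        if self.next_index >= len(self.checkpoints):
            return
        if distance(position, self.checkpoints[self.next_index]) <= self.trigger_radius_m:
            self.pulse()
            self.next_index += 1
```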
The findings presented in this article were obtained through a preliminary exploratory study conducted at Offenburg University as part of the Fighting Loneliness project of the institution's Affective & Cognitive Institute (ACI) from October 2019 to February 2020. The initiative's main objective was to answer the research question "How should an app be designed to reduce loneliness and social isolation among university students?" in collaboration with the institution's students.
Deafblindness, a form of dual sensory impairment, significantly impacts communication, access to information and mobility. Independent navigation and wayfinding are main challenges faced by individuals living with combined hearing and visual impairments. We developed a haptic wearable that provides sensory substitution and navigational cues for users with deafblindness by conveying vibrotactile signals onto the body. Vibrotactile signals in the waist area convey directional and proximity information collected via a fisheye camera attached to the garment, while semantic information is provided by a tapping system on the shoulders. A playful scenario called "Keep Your Distance" was designed to test the navigation system: individuals with deafblindness were "secret agents" who needed to follow a "suspect" while keeping an optimal distance of 1.5 meters from the other person to win the game. Preliminary findings suggest that individuals with deafblindness enjoyed the experience and were generally able to follow the directional cues.
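To make the described mapping more concrete, the following is a minimal sketch, under assumed hardware, of how a bearing and distance estimate (e.g. derived from the fisheye camera) could be turned into a waist-motor selection and a vibration intensity. The motor count, the intensity mapping, and the function names are illustrative assumptions rather than the published system.

```python
# Illustrative sketch (assumed design): bearing -> waist motor, distance -> intensity.
import math

NUM_MOTORS = 8            # motors evenly spaced around the waist (assumption)
TARGET_DISTANCE_M = 1.5   # optimal following distance in the game scenario

def select_motor(bearing_deg: float) -> int:
    """Pick the motor closest to the target bearing (0 deg = straight ahead)."""
    step = 360.0 / NUM_MOTORS
    return int(round((bearing_deg % 360.0) / step)) % NUM_MOTORS

def intensity(distance_m: float) -> float:
    """Vibrate more strongly the further the wearer is from the 1.5 m target."""
    error = abs(distance_m - TARGET_DISTANCE_M)
    return max(0.0, min(1.0, error / TARGET_DISTANCE_M))

def vibrotactile_command(bearing_deg: float, distance_m: float) -> dict:
    return {"motor": select_motor(bearing_deg), "intensity": intensity(distance_m)}

# Example: suspect slightly to the right and too far away.
# print(vibrotactile_command(bearing_deg=30.0, distance_m=2.4))
```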