Theoretical details about optics and photonics are not common knowledge nowadays. Physicists are keen to explain ‘light’ scientifically, a phenomenon with a huge impact on our lives. It is necessary to examine it from multiple perspectives and to make the knowledge accessible to the public in an interdisciplinary, scientifically well-grounded, and appealing media format. To allow an information exchange on a global scale, our project “Invisible Light” establishes a worldwide accessible platform. Its contents will not be created by a single entity, but will be user-generated with the help of the global community. The article describes the infotainment portal “Invisible Light,” which stores scientific articles about light and photonics and makes them accessible worldwide. All articles are tagged with geo-coordinates so that they can be uniquely identified and localized. A smartphone application is used for visualization, transmitting the information to users in real time by means of an augmented reality application. In this way, scientific information is made accessible to a broad audience in an attractive manner.
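The geo-coordinate tagging described above could, for instance, be backed by a great-circle distance lookup. The following sketch uses the haversine formula to find articles near the user's position; the article records, coordinates, and function names are illustrative assumptions, not the portal's actual implementation.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS-84 coordinates, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def articles_nearby(articles, lat, lon, radius_km):
    """Return all geo-tagged articles within radius_km of the given position."""
    return [a for a in articles
            if haversine_km(lat, lon, a["lat"], a["lon"]) <= radius_km]

# Hypothetical article records; the real portal would query a database.
articles = [
    {"title": "Polarisation demo", "lat": 48.45, "lon": 7.95},
    {"title": "Laser interference", "lat": 52.52, "lon": 13.40},
]
near = articles_nearby(articles, 48.46, 7.94, radius_km=5.0)
```

The AR client would run such a query against the user's current GPS position and overlay only the nearby articles.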
In many scientific degree programs, lens experiments are part of the curriculum. The experiments are meant to give students a basic understanding of the laws of optics and their applications. Most of the experiments need special hardware such as an optical bench, light sources, apertures, and different lens types. It is therefore not possible for students to conduct any of the experiments outside the university’s laboratory. Simple optical software simulators that enable students to perform lens experiments virtually already exist, but they are mostly desktop- or web-browser-based.
Augmented Reality (AR) is a special case of mediated- and mixed-reality concepts in which computers are used to add to, subtract from, or modify one’s perception of reality. As a result of the success and widespread availability of handheld mobile devices such as tablet computers and smartphones, mobile augmented reality applications are easy to use. Augmented reality lends itself to visualizing a simulated optical bench: students can interactively modify properties such as lens type, lens curvature, lens diameter, lens refractive index, and the positions of the instruments in space. Light rays can be visualized, reinforcing the understanding of the laws of optics. An AR application like this is ideally suited to preparing the actual laboratory sessions and/or recapping the teaching content.
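The core of such a simulated optical bench is the thin-lens and lensmaker's equations, which relate lens curvature and refractive index to the focal length and image position. A minimal sketch (the numeric lens values are illustrative, not from the paper):

```python
def lensmaker_focal_length(n, r1, r2):
    """Thin-lens focal length from refractive index n and surface radii r1, r2
    (sign convention: r > 0 for a surface convex toward the incoming light)."""
    return 1.0 / ((n - 1.0) * (1.0 / r1 - 1.0 / r2))

def image_distance(f, d_object):
    """Thin-lens equation 1/f = 1/d_o + 1/d_i, solved for the image distance."""
    return 1.0 / (1.0 / f - 1.0 / d_object)

# Illustrative biconvex lens: n = 1.5, |r1| = |r2| = 100 mm  ->  f = 100 mm
f = lensmaker_focal_length(1.5, 100.0, -100.0)
d_i = image_distance(f, 300.0)        # object placed 300 mm in front of the lens
magnification = -d_i / 300.0          # negative: the image is inverted
```

Re-evaluating these formulas each frame as the student drags the lens or changes its parameters is enough to redraw the ray diagram in real time.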
The authors will present their experience with handheld augmented reality applications and the possibilities they offer for light and optics experiments without the need for specialized optical hardware.
Monitors are at the center of media productions and serve an important function as the main visual interface. Tablets and smartphones are becoming ever more important work tools in the media industry. As an extension of our lecture contents, an intensive discussion of different display technologies and their applications is now taking place. The focus is on the established LCD (Liquid Crystal Display) technology and the promising OLED (Organic Light-Emitting Diode) technology.
The classic LCD is currently the most important display technology. The paper presents how students can develop a feel for display technologies beyond the theoretical scientific basics. The workshop increasingly focuses on the technical aspects of display technology and aims to deepen the students’ understanding of how displays work by having them build simple liquid crystal displays themselves.
The authors will present their experience in the field of display technologies. A mixture of theoretical and practical lectures aims at a deeper understanding of digital color representation and display technologies. The design and development of a suitable learning environment with the required infrastructure is crucial. The main focus of this paper is on the hands-on optics workshop “Liquid Crystal Display in the do-it-yourself”.
Our university carries out various research projects. Among others, the project Schluckspecht is interdisciplinary work on different ultra-efficient car concepts for international contests. Besides the engineering work, one part of the project deals with real-time data visualization. In order to increase the efficiency of the vehicle, online monitoring of the runtime parameters is necessary. The driving parameters of the vehicle are transmitted to a processing station via a wireless network connection. We plan to use an augmented reality (AR) application to visualize different data on top of the view of the real car. Using a mobile Android or iOS device, a user can interactively view various real-time and statistical data. The car and its components are to be augmented with various additional information, which should appear at the correct positions on the components. The engine, for example, could show the current rpm and consumption values, while the battery could show the current charge level. The goal of this paper is to evaluate different possible approaches and their suitability, and to expand our application to other projects at our university.
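One way to turn the transmitted runtime parameters into per-component AR overlays might look as follows. The packet layout, field names, and units are assumptions for illustration, not the project's actual telemetry protocol.

```python
import json

# Hypothetical telemetry packet as it might arrive over the wireless link.
packet = json.dumps({
    "engine": {"rpm": 2450, "consumption_l_per_100km": 0.9},
    "battery": {"charge_pct": 87.0},
})

def overlay_labels(raw_packet):
    """Decode one telemetry packet and build the AR overlay text per component."""
    data = json.loads(raw_packet)
    labels = {}
    if "engine" in data:
        e = data["engine"]
        labels["engine"] = f"{e['rpm']} rpm / {e['consumption_l_per_100km']} l/100km"
    if "battery" in data:
        labels["battery"] = f"{data['battery']['charge_pct']:.0f} % charge"
    return labels

labels = overlay_labels(packet)
```

The AR layer would then anchor each label to the tracked 3-D position of the corresponding component, updating the text whenever a new packet arrives.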
Walking interfaces offer advantages for navigation in virtual environment (VE) systems over other types of locomotion. However, VR helmets have the disadvantage that users cannot see their immediate surroundings. Our publication describes the prototypical implementation of a VE system capable of detecting possible obstacles using an RGB-D sensor. In order to warn users of potential collisions with real objects while they are moving through the VE tracking area, we designed four different visual warning metaphors: Placeholder, Rubber Band, Color Indicator, and Arrow. A small pilot study was carried out in which the participants had to solve a simple task and avoid arbitrarily placed physical obstacles while crossing the virtual scene. Our results show that the Placeholder metaphor (in this case: trees), compared to the other variants, seems best suited both for correctly estimating the position of obstacles and for the ability to evade them.
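The decision of when to trigger a warning metaphor can be reduced to distance thresholds on the tracked obstacle positions. A minimal sketch, assuming illustrative thresholds and a 2-D floor-plane representation (the metaphor names follow the four variants above; everything else is a hypothetical simplification of the described system):

```python
import math

# The four warning metaphors from the study; which one is active is a design choice.
METAPHORS = ("Placeholder", "Rubber Band", "Color Indicator", "Arrow")

def nearest_obstacle(user_pos, obstacles):
    """Distance (m) and position of the obstacle closest to the user (floor plane)."""
    return min(((math.dist(user_pos, o), o) for o in obstacles), key=lambda t: t[0])

def warning_level(distance, warn_at=1.5, urgent_at=0.5):
    """Map the obstacle distance (metres) to a warning stage for the active metaphor."""
    if distance <= urgent_at:
        return "urgent"
    if distance <= warn_at:
        return "warn"
    return "none"

active = METAPHORS[0]  # e.g. the Placeholder metaphor (trees)
d, pos = nearest_obstacle((0.0, 0.0), [(2.0, 0.0), (0.0, 1.0), (3.0, 3.0)])
level = warning_level(d)
```

At the "warn" stage the active metaphor would fade in at the obstacle's position; at "urgent" it would be rendered at full intensity.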
With the current generation of devices, Virtual Reality (VR) has finally made it into the living rooms of end users. These devices feature 6-DOF tracking, allowing users to move naturally in virtual worlds and experience them even more immersively. However, natural locomotion in the virtual world requires a corresponding free space in the real environment. The available space is often limited, especially in everyday environments and under normal spatial conditions. Furnishings and objects of daily life can quickly become obstacles for VR users if they are not cleared away. Since the idea behind VR is to place users in a virtual world and to hide the real world as much as possible, invisible objects represent potential hazards. The currently available systems offer only rudimentary assistance for this problem: if a user is about to leave the space previously defined for use, a visual boundary is displayed to allow orientation within the space. These visual metaphors are intended to prevent users from leaving the safe area. However, there is no detection of potentially dangerous objects within this part of the space; objects that have not been cleared away, or that have been added in the meantime, may still become obstacles. This thesis shows how possible obstacles in the environment can be detected automatically with range imaging cameras, and how users can be effectively warned about them in the virtual environment without significantly disturbing their sense of presence. Four different interactive visual metaphors are used to signal the obstacles within the VE. With the help of a user study, the four signaling variants and the obstacle detection were evaluated and tested.
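At its simplest, the range-camera detection step amounts to thresholding each depth frame and flagging clusters of close returns. The following sketch operates on a tiny synthetic depth image; frame contents, thresholds, and function names are illustrative assumptions, not the thesis' actual pipeline.

```python
def detect_obstacles(depth_frame, max_range_m=2.0, min_pixels=4):
    """Flag an obstacle when enough depth pixels fall below max_range_m.
    depth_frame is a list of rows of depth values in metres (0.0 = no return)."""
    close = [(r, c)
             for r, row in enumerate(depth_frame)
             for c, d in enumerate(row)
             if 0.0 < d < max_range_m]
    return close if len(close) >= min_pixels else []

# Synthetic 4x4 depth frame: a small object ~1.2 m away in front of a far wall.
frame = [
    [4.0, 4.0, 4.0, 4.0],
    [4.0, 1.2, 1.3, 4.0],
    [4.0, 1.2, 1.2, 4.0],
    [4.0, 4.0, 4.0, 4.0],
]
hits = detect_obstacles(frame)
```

The `min_pixels` cutoff suppresses single-pixel sensor noise; a real system would additionally project the flagged pixels into the tracking-space coordinates used by the warning metaphors.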