The International Year of Light and Light-Based Technologies 2015 (IYL 2015) was celebrated around the world. Worldwide activities were organized to highlight the impact of optics and photonics on life, science, economics, arts and culture, and education. With most of our activities at Offenburg University of Applied Sciences (Offenburg, Germany), we reached our own students and the general population of our region:
- University for Children: "The Magic of Light" winter lecture program and "Across the Universe with Relativity and Quantum Theory" summer lecture program
- "Students Meet Scientists"
- "A Century of General Relativity Theory" lecture program
With some of our activities we also engaged a worldwide audience:
- IYL 2015 art poster collection ("Magic of Light" and "No Football, Just Photonics")
- Smart Interactive Projection
- Twitter Wall
- "Invisible Light"
- Live broadcast of the total lunar eclipse
- Film Festival, Merida, Mexico
The authors will highlight recent activities at our university dedicated to promoting, celebrating, and creating a legacy for the IYL 2015.
Our university carries out various research projects. Among them, the Schluckspecht project is interdisciplinary work on different ultra-efficient car concepts for international contests. Besides the engineering work, one part of the project deals with real-time data visualization. To increase the efficiency of the vehicle, online monitoring of its runtime parameters is necessary. The driving parameters of the vehicle are transmitted to a processing station via a wireless network connection. We plan to use an augmented reality (AR) application to visualize different data on top of the view of the real car. Using a mobile Android or iOS device, a user can interactively view various real-time and statistical data. The car and its components are to be augmented with additional information, which should appear at the correct position of each component. An engine, for example, could show the current rpm and consumption values; a battery could show its current charge level. The goal of this paper is to evaluate different possible approaches and their suitability, and to extend our application to other projects at our university.
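The core of such an overlay step is pairing incoming telemetry values with the 3D anchor positions of the car components they belong to. The following is a minimal sketch of that pairing; all names (`ComponentAnchor`, `build_overlays`) and the data layout are illustrative assumptions, not the project's actual data model.

```python
from dataclasses import dataclass

@dataclass
class ComponentAnchor:
    name: str        # e.g. "engine", "battery" (hypothetical identifiers)
    position: tuple  # 3D position on the tracked car model (x, y, z) in metres

@dataclass
class OverlayLabel:
    text: str
    position: tuple

def build_overlays(anchors, telemetry):
    """Pair live telemetry values with the component positions they belong to."""
    overlays = []
    for anchor in anchors:
        if anchor.name in telemetry:
            overlays.append(OverlayLabel(
                text=f"{anchor.name}: {telemetry[anchor.name]}",
                position=anchor.position))
    return overlays

anchors = [ComponentAnchor("engine", (0.9, 0.3, 0.0)),
           ComponentAnchor("battery", (-0.6, 0.2, 0.1))]
telemetry = {"engine": "2450 rpm", "battery": "87 %"}
for label in build_overlays(anchors, telemetry):
    print(label.text, "@", label.position)
```

An AR framework would then render each `OverlayLabel` at its anchor position relative to the tracked car pose; the sketch only covers the data pairing, not the rendering.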
Practical exercises are a crucial part of many curricula. Even simple exercises can improve the understanding of the underlying subject. Most experimental setups require special hardware: to carry out, e.g., a lens experiment, the students need access to an optical bench, various lenses, light sources, apertures and a screen. In our previous publication we demonstrated the use of augmented reality visualization techniques to let the students prepare with a simulated experimental setup. Within the context of our intended blended learning concept, we want to utilize augmented or virtual reality techniques for stationary laboratory exercises. Unlike applications running on mobile devices, stationary setups can be extended more easily with additional interfaces and thus allow for more complex interactions and simulations in virtual reality (VR) and augmented reality (AR). The most significant difference is the possibility of allowing interactions beyond touching a screen. The LEAP Motion controller is a small, inexpensive device that tracks the user's hands and fingers in three dimensions. It is conceivable to let the user interact with the simulation's virtual elements through hand position, movement and gestures. In this paper we evaluate possible applications of the LEAP Motion controller for simulated experiments in augmented and virtual reality. We pay particular attention to the device's strengths and weaknesses and point out useful and less useful application scenarios. © (2016) Society of Photo-Optical Instrumentation Engineers (SPIE).
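A gesture-driven interaction of the kind described above can be reduced to a simple rule: if the tracked hand pinches close enough to a virtual element, the element follows the hand. The sketch below illustrates this rule; the `frame` dictionary is a deliberate simplification and not the actual Leap Motion SDK data format.

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def update_grab(frame, element_pos, grab_radius=0.05):
    """Return the element's new position if a pinching hand is close enough.

    frame: simplified tracking data with palm position (metres) and a
    pinch strength in [0, 1] (hypothetical layout, not the real SDK).
    """
    if frame["pinch_strength"] > 0.8 and \
            distance(frame["palm_pos"], element_pos) < grab_radius:
        return frame["palm_pos"]  # element follows the hand
    return element_pos            # otherwise the element stays put

frame = {"palm_pos": (0.10, 0.21, 0.30), "pinch_strength": 0.95}
print(update_grab(frame, (0.12, 0.20, 0.31)))
```

In a real setup this rule would run once per tracking frame; the thresholds (pinch strength, grab radius) are the kind of parameters such an evaluation would have to tune.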
In many scientific study programs, lens experiments are part of the curriculum. The experiments are meant to give the students a basic understanding of the laws of optics and their applications. Most of the experiments need special hardware, e.g. an optical bench, light sources, apertures and different lens types. It is therefore not possible for the students to conduct any of the experiments outside the university's laboratory. Simple optical software simulators that enable the students to virtually perform lens experiments already exist, but they are mostly desktop or web browser based.
Augmented reality (AR) is a special case of mediated and mixed reality concepts, in which computers are used to add, subtract or modify one's perception of reality. As a result of the success and widespread availability of handheld mobile devices such as tablet computers and smartphones, mobile augmented reality applications are easy to use. Augmented reality lends itself to visualizing a simulated optical bench. The students can interactively modify properties such as lens type, lens curvature, lens diameter, lens refractive index and the positions of the instruments in space. Light rays can be visualized to promote a deeper understanding of the laws of optics. An AR application like this is ideally suited to preparing the actual laboratory sessions and/or recapping the teaching content.
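At its core, a simulated optical bench like the one described above rests on the thin-lens equation, 1/f = 1/d_o + 1/d_i. A minimal sketch of the underlying computation (function names are illustrative, not the application's actual code):

```python
def image_distance(f, d_o):
    """Solve the thin-lens equation 1/f = 1/d_o + 1/d_i for d_i.

    f and d_o in the same unit (e.g. cm); a negative result
    indicates a virtual image on the object's side of the lens.
    """
    if d_o == f:
        raise ValueError("object at the focal point: image at infinity")
    return 1.0 / (1.0 / f - 1.0 / d_o)

def magnification(d_o, d_i):
    """Lateral magnification; negative means the image is inverted."""
    return -d_i / d_o

# Converging lens, f = 10 cm, object 30 cm in front of the lens:
d_i = image_distance(f=10.0, d_o=30.0)
print(d_i, magnification(30.0, d_i))
```

Ray visualization then follows from the same quantities: the principal rays through the lens center and focal points meet at the computed image position.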
The authors will present their experience with handheld augmented reality applications and their possibilities for light and optics experiments without the need for specialized optical hardware.
Theoretical details about optics and photonics are not common knowledge nowadays. Physicists are keen to scientifically explain 'light,' which has a huge impact on our lives. It is necessary to examine it from multiple perspectives and to make the knowledge accessible to the public in an interdisciplinary, scientifically well-grounded and appealing way across media. To allow an information exchange on a global scale, our project "Invisible Light" establishes a worldwide accessible platform. Its contents are not created by a single entity but are user-generated, with the help of the global community. The article describes the infotainment portal "Invisible Light," which stores scientific articles about light and photonics and makes them accessible worldwide. All articles are tagged with geo-coordinates, so they can be clearly identified and localized. A smartphone application is used for visualization, transmitting the information to users in real time by means of an augmented reality application. Scientific information is thus made accessible to a broad audience in an attractive manner.
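Serving geo-tagged articles to a nearby user boils down to a distance query over the stored coordinates. The sketch below shows one common way to do this with the haversine great-circle distance; the record layout and the sample articles are invented for illustration, not the portal's actual schema.

```python
import math

# Hypothetical article records; the portal's real schema is not given.
ARTICLES = [
    {"title": "Infrared astronomy", "lat": 48.47, "lon": 7.94},
    {"title": "Laser safety basics", "lat": 52.52, "lon": 13.40},
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two geo-coordinates."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + \
        math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))  # mean Earth radius 6371 km

def nearby(lat, lon, radius_km):
    """All articles geo-tagged within radius_km of the given position."""
    return [a for a in ARTICLES
            if haversine_km(lat, lon, a["lat"], a["lon"]) <= radius_km]

# A user near Offenburg (48.47 N, 7.94 E) sees only the first article:
print([a["title"] for a in nearby(48.47, 7.94, 50.0)])
```

An AR browser on the smartphone would then place the returned articles relative to the user's position and heading; at scale, a spatial index would replace the linear scan.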
Walking interfaces offer advantages for navigation in virtual environment (VE) systems over other types of locomotion. However, VR headsets have the disadvantage that users cannot see their immediate surroundings. Our publication describes the prototypical implementation of a VE system capable of detecting possible obstacles using an RGB-D sensor. In order to warn users of potential collisions with real objects while they move through the VE tracking area, we designed four different visual warning metaphors: Placeholder, Rubber Band, Color Indicator and Arrow. A small pilot study was carried out in which the participants had to solve a simple task and avoid arbitrarily placed physical obstacles while crossing the virtual scene. Our results show that the Placeholder metaphor (in this case: trees), compared to the other variants, seems best suited both for correctly estimating the position of obstacles and for evading them.
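Whatever metaphor is shown, the trigger is the same: the distance from the tracked user to the nearest detected obstacle. The sketch below illustrates such a trigger on the floor plane; the thresholds and the tiered "danger/warn/clear" scheme are assumptions for illustration (the study itself compared the four metaphors, not threshold values).

```python
def warning_level(user_pos, obstacles, warn_dist=1.5, danger_dist=0.5):
    """Classify proximity to the nearest physical obstacle.

    user_pos, obstacles: 2D floor-plane coordinates in metres,
    as a head tracker and RGB-D sensor might deliver them.
    """
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    nearest = min(dist(user_pos, o) for o in obstacles)
    if nearest < danger_dist:
        return "danger"  # e.g. strong cue such as a color tint
    if nearest < warn_dist:
        return "warn"    # e.g. show a placeholder tree at the obstacle
    return "clear"

obstacles = [(2.0, 0.0), (0.0, 3.0)]
print(warning_level((1.8, 0.2), obstacles))
```

Evaluated per tracking frame, this classification would decide when to fade the chosen warning metaphor in and out.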