The International Year of Light and Light-Based Technologies 2015 (IYL 2015) was celebrated around the world. Worldwide activities were organized to highlight the impact of optics and photonics on life, science, economics, arts and culture, and education. With most of our activities at Offenburg University of Applied Sciences (Offenburg, Germany), we reached our own students and the general population of our region:

- University for Children: the “The Magic of Light” winter lecture program and the “Across the Universe with Relativity and Quantum Theory” summer lecture program
- “Students Meet Scientists”
- “A Century of General Relativity Theory” lecture program

Nevertheless, with some of our activities we also engaged a worldwide audience:

- IYL 2015 art poster collections (“Magic of Light” and “No Football, Just Photonics”)
- Smart Interactive Projection
- Twitter Wall
- “Invisible Light”
- Live broadcast of the total lunar eclipse
- Film Festival in Mérida, Mexico

The authors highlight recent activities at our university dedicated to promoting, celebrating, and creating a legacy for the IYL 2015.
Our university carries out various research projects. Among them, the Schluckspecht project is interdisciplinary work on different ultra-efficient car concepts for international contests. Besides the engineering work, one part of the project deals with real-time data visualization. In order to increase the efficiency of the vehicle, online monitoring of its runtime parameters is necessary. The driving parameters of the vehicle are transmitted to a processing station via a wireless network connection. We plan to use an augmented reality (AR) application to visualize different data on top of the view of the real car. Using a mobile Android or iOS device, a user can interactively view various real-time and statistical data. The car and its components are to be augmented with additional information, and that information should appear at the position of the corresponding component: an engine, for example, could show its current rpm and consumption values, while a battery could show its current charge level. The goal of this paper is to evaluate different possible approaches and their suitability, and to expand our application to other projects at our university.
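One way to picture the data path described above is a small telemetry receiver that decodes incoming wireless packets into named runtime parameters, keyed by the component each AR label is attached to. The following Python sketch is illustrative only; the JSON packet format, the field names, and the function names are assumptions, not the project's actual protocol.

```python
import json

def decode_packet(raw: bytes) -> dict:
    """Decode one hypothetical telemetry packet into {component: {parameter: value}}."""
    msg = json.loads(raw.decode("utf-8"))
    return {msg["component"]: msg["values"]}

def merge_telemetry(state: dict, packet: dict) -> dict:
    """Fold a decoded packet into the latest-known state read by the AR overlay."""
    for component, values in packet.items():
        state.setdefault(component, {}).update(values)
    return state

# Two example packets, one per augmented component.
state = {}
merge_telemetry(state, decode_packet(b'{"component": "engine", "values": {"rpm": 3100}}'))
merge_telemetry(state, decode_packet(b'{"component": "battery", "values": {"charge": 0.87}}'))
```

Each AR annotation would then simply render the latest entry for its component from `state`, regardless of how often packets arrive.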
With the current generation of devices, Virtual Reality (VR) has actually made it into the living rooms of end users. These devices feature 6-DOF tracking, allowing users to move naturally in virtual worlds and experience them even more immersively. However, natural locomotion in the virtual world requires corresponding free space in the real environment. The available space is often limited, especially in everyday environments and under normal spatial conditions. Furnishings and objects of daily life can quickly become obstacles for VR users if they are not cleared away. Since the idea behind VR is to place users in a virtual world and to hide the real world as much as possible, invisible objects represent potential hazards. Currently available systems offer only rudimentary assistance with this problem: if a user is about to leave the space previously defined for use, a visual boundary is displayed to allow orientation within that space. These visual metaphors are intended to prevent users from leaving the safe area. However, there is no detection of potentially dangerous objects within this part of space; objects that have not been cleared away, or that have been added in the meantime, may still become obstacles. This thesis shows how possible obstacles in the environment can be detected automatically with range imaging cameras, and how users can be effectively warned about them in the virtual environment (VE) without significantly disturbing their sense of presence. Four different interactive visual metaphors are used to signal the obstacles within the VE. In a user study, the four signaling variants and the obstacle detection were evaluated and tested.
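A minimal form of the obstacle detection described above can be sketched as a depth-image threshold: any sufficiently large cluster of pixels closer than a safety distance is flagged as a potential obstacle. This NumPy sketch is a simplified illustration under assumed camera parameters and thresholds, not the thesis's actual detection pipeline.

```python
import numpy as np

def detect_obstacles(depth_m: np.ndarray, warn_dist: float = 1.0,
                     min_pixels: int = 50) -> bool:
    """Return True if enough depth pixels fall inside the warning distance.

    depth_m: 2-D array of per-pixel distances in meters from a range
    imaging camera; 0 marks invalid measurements.
    """
    valid = depth_m > 0                      # discard invalid readings
    close = valid & (depth_m < warn_dist)    # pixels inside the safety zone
    return int(close.sum()) >= min_pixels    # ignore small noise blobs

# Synthetic frame: a far wall at 3 m with a 10x10-pixel object at 0.6 m.
frame = np.full((240, 320), 3.0)
frame[100:110, 150:160] = 0.6
```

A positive result would then trigger one of the visual warning metaphors; the `min_pixels` filter stands in for the more robust clustering a real system would need against sensor noise.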
Practical exercises are a crucial part of many curricula. Even simple exercises can improve the understanding of the underlying subject. Most experimental setups require special hardware: to carry out a lens experiment, for example, students need access to an optical bench, various lenses, light sources, apertures, and a screen. In our previous publication we demonstrated the use of augmented reality visualization techniques to let students prepare with a simulated experimental setup. Within the context of our intended blended learning concept, we want to utilize augmented or virtual reality techniques for stationary laboratory exercises. Unlike applications running on mobile devices, stationary setups can be extended more easily with additional interfaces and thus allow for more complex interactions and simulations in virtual reality (VR) and augmented reality (AR). The most significant difference is the possibility of interactions beyond touching a screen. The LEAP Motion controller is a small, inexpensive device that tracks the user’s hands and fingers in three dimensions. It is conceivable to let users interact with the simulation’s virtual elements through their hand positions, movements, and gestures alone. In this paper we evaluate possible applications of the LEAP Motion controller for simulated experiments in augmented and virtual reality. We pay particular attention to the device’s strengths and weaknesses and point out useful and less useful application scenarios. © 2016 Society of Photo-Optical Instrumentation Engineers (SPIE).
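A typical hand-based interaction of the kind sketched above is a pinch gesture for grabbing a virtual lens: when thumb and index fingertips come close enough together, the nearest virtual element is selected. The fingertip coordinates below merely stand in for tracking data such as the LEAP Motion controller provides; the function names and the threshold are illustrative assumptions, not the device's API.

```python
import math

def is_pinch(thumb_tip, index_tip, threshold_mm: float = 25.0) -> bool:
    """Detect a pinch: thumb and index fingertips closer than a threshold.

    Both fingertips are (x, y, z) positions in millimeters, as a hand
    tracker might report them; the 25 mm threshold is an assumption.
    """
    return math.dist(thumb_tip, index_tip) < threshold_mm

# Fingertips 10 mm apart read as a pinch; 80 mm apart as an open hand.
pinched = is_pinch((0.0, 0.0, 0.0), (10.0, 0.0, 0.0))
released = is_pinch((0.0, 0.0, 0.0), (80.0, 0.0, 0.0))
```

While the gesture is held, the grabbed lens could follow the midpoint of the two fingertips along the simulated optical bench.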
In many degree programs, lens experiments are part of the curriculum. The conducted experiments are meant to give students a basic understanding of the laws of optics and their applications. Most of the experiments need special hardware such as an optical bench, light sources, apertures, and different lens types. It is therefore not possible for students to conduct any of the experiments outside the university’s laboratory. Simple optical software simulators that let students perform lens experiments virtually already exist, but most are desktop or web browser based.
Augmented Reality (AR) is a special case of mediated and mixed reality concepts, in which computers are used to add to, subtract from, or modify one’s perception of reality. As a result of the success and widespread availability of handheld mobile devices such as tablet computers and smartphones, mobile augmented reality applications are easy to use. Augmented reality can readily be used to visualize a simulated optical bench. Students can interactively modify properties such as lens type, lens curvature, lens diameter, lens refractive index, and the positions of the instruments in space. Light rays can be visualized, promoting an additional understanding of the laws of optics. An AR application like this is ideally suited to preparing for the actual laboratory sessions and/or recapping the teaching content.
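The core of such a lens simulation is the thin lens equation, 1/f = 1/d_o + 1/d_i. The sketch below shows how a simulator might place the image and compute the magnification for a given focal length and object distance; the function and parameter names are our own, not taken from the described application.

```python
def thin_lens_image(focal_len: float, object_dist: float):
    """Return (image_dist, magnification) for an ideal thin lens.

    Solves 1/f = 1/d_o + 1/d_i for d_i. A positive image distance
    means a real image on the far side of the lens; a negative one
    means a virtual image. Magnification is -d_i / d_o.
    """
    if object_dist == focal_len:
        return None  # rays emerge parallel: image at infinity
    image_dist = 1.0 / (1.0 / focal_len - 1.0 / object_dist)
    magnification = -image_dist / object_dist
    return image_dist, magnification

# Object 30 cm in front of a 10 cm converging lens:
# 1/10 - 1/30 = 1/15, so the image forms 15 cm behind the lens,
# inverted and at half size (magnification -0.5).
```

In an AR view, moving the virtual lens or object would simply re-evaluate this function and redraw the traced rays and the image position.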
The authors will present their experience with handheld augmented reality applications and their possibilities for light and optics experiments without the need for specialized optical hardware.
Theoretical details about optics and photonics are not common knowledge nowadays. Physicists are keen to scientifically explain ‘light,’ which has a huge impact on our lives. It is necessary to examine it from multiple perspectives and to make the knowledge accessible to the public in an interdisciplinary, scientifically well-grounded, and appealing medial way. To allow an information exchange on a global scale, our project “Invisible Light” establishes a globally accessible platform. Its contents are not created by a single entity but are user-generated, with the help of the global community. The article describes the infotainment portal “Invisible Light,” which stores scientific articles about light and photonics and makes them accessible worldwide. All articles are tagged with geo-coordinates, so they can be clearly identified and localized. A smartphone application is used for visualization, transmitting the information to users in real time by means of augmented reality. Scientific information is thus made accessible to a broad audience in an attractive manner.
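Since every article carries geo-coordinates, the smartphone client's basic task is to find the articles nearest the user's position. A standard way to do this is the great-circle distance via the haversine formula; this sketch is illustrative, with invented sample data, and is not taken from the Invisible Light implementation.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon coordinates."""
    R = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def nearest_articles(articles, lat, lon, k=3):
    """Return the k geo-tagged articles closest to the user at (lat, lon)."""
    return sorted(articles, key=lambda a: haversine_km(lat, lon, a["lat"], a["lon"]))[:k]

# Invented sample entries, keyed only by title and position.
articles = [
    {"title": "Berlin piece", "lat": 52.52, "lon": 13.40},
    {"title": "Offenburg piece", "lat": 48.47, "lon": 7.94},
]
```

The AR client could then overlay only the top-k nearest articles on the camera view, refreshing the query as the user moves.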