In this work, we investigate how gamification can be integrated into work processes in the automotive industry. The contribution contains five parts: (1) An introduction showing how gamification has become increasingly common, especially in education, health and the service industry. (2) An analysis of the state of the art of gamified applications, discussing several best practices. (3) An analysis of the special requirements for gamification in production, regarding both external norms and the mindset of workers in this domain. (4) An overview of first approaches towards a gamification of production, focusing on solutions for impaired workers in sheltered work organizations. (5) A study with a focus group of instructors at two large car manufacturers. Based on the presentation of three potential designs for the gamification of production, the study investigates the general acceptance of gamification in modern production and determines which design is best suited for future implementations.
The number of impaired persons is rising, as a result of both regular degradation with age and psychological problems like burnout. Sheltered work organizations aim to reintegrate impaired persons into work environments and to prepare them for re-entry into the regular job market.
For both elderly and impaired persons it is crucial to quickly assess their abilities, to identify limits and potentials, and thus to find work processes suitable for their skill profile.
This work focuses on the analysis and comparison of software tools that assess the abilities of persons with impairments. We describe two established generic tools (CANTAB, Cogstate), analyze a little-known specialized tool (Hamet) and present a new gamified tool (GATRAS).
Finally, we present a study with 20 participants with impairments, comparing the tools against a ground truth baseline generated by a real-world assembly task.
The paper proposes an approach to effectively estimate the probability of buffer overflow in high-speed communication networks that are capable of carrying diverse traffic, including self-similar teletraffic, and of supporting diverse levels of quality of service. Simulations with stochastic, long-range dependent self-similar traffic source models are conducted. A new efficient algorithm, based on a variant of the RESTART/LRE method, is developed and applied to accelerate the buffer overflow simulation in a finite-buffer single-server model under long-range dependent self-similar traffic load with different buffer sizes. Numerical examples and simulation results are shown.
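To illustrate why acceleration techniques like RESTART/LRE are needed, the sketch below estimates the overflow probability of a discrete-time finite-buffer single-server queue by plain Monte-Carlo simulation. The Bernoulli arrival/service process is an assumption for illustration only; the paper itself uses long-range dependent self-similar traffic and a splitting method to reach the rare-event regime that naive simulation cannot.

```python
import random

# Baseline (non-accelerated) Monte-Carlo sketch: estimate the buffer
# overflow (packet loss) probability of a discrete-time finite-buffer
# single-server queue. Bernoulli arrivals/services are an assumption
# for illustration; the paper uses long-range dependent traffic and
# accelerates this kind of simulation with a RESTART/LRE variant.

def overflow_probability(p_arrival, p_service, buffer_size, steps=200_000):
    queue, overflows, arrivals = 0, 0, 0
    for _ in range(steps):
        if random.random() < p_arrival:
            arrivals += 1
            if queue < buffer_size:
                queue += 1
            else:
                overflows += 1          # packet lost: buffer full
        if queue > 0 and random.random() < p_service:
            queue -= 1
    return overflows / max(1, arrivals)

# Larger buffers drive the overflow probability down sharply, which is
# exactly what makes naive simulation slow and motivates RESTART-style
# importance splitting.
print(overflow_probability(0.4, 0.5, buffer_size=10))
```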
Economic dispatch is a well-known optimization problem in smart grid systems that aims at minimizing the total cost of power generation across generation units while maintaining system constraints. Recently, some distributed consensus-based approaches have been proposed to replace traditional centralized calculation. However, existing approaches fail to protect the privacy of individual units, such as cost function parameters, generator constraints, and output power levels. In this paper, we show an attack against an existing consensus-based economic dispatch algorithm from [16], assuming semi-honest, non-colluding adversaries. We then propose a simple solution, combining a secure sum protocol with the consensus-based economic dispatch algorithm, that guarantees data privacy under the same attacker model. Our Privacy Preserving Economic Dispatch (PPED) protocol is information-theoretically secure.
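The abstract does not specify which secure sum construction PPED uses; a minimal sketch of one standard variant, additive secret sharing under semi-honest non-colluding parties, looks like this. All function names and the modulus are illustrative assumptions, not the paper's protocol.

```python
import random

# Illustrative sketch of a secure sum via additive secret sharing
# (one standard construction; the exact protocol used by PPED is an
# assumption for illustration). Each party learns only random-looking
# shares and the final sum, never another party's private value.

MOD = 2**61 - 1  # large prime modulus; secrets assumed non-negative ints

def make_shares(secret, n):
    """Split `secret` into n additive shares summing to secret mod MOD."""
    shares = [random.randrange(MOD) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % MOD)
    return shares

def secure_sum(secrets):
    """Each party splits its secret; party j collects the j-th share from
    everyone and publishes only that partial sum, so no value leaks."""
    n = len(secrets)
    all_shares = [make_shares(s, n) for s in secrets]
    partial = [sum(all_shares[i][j] for i in range(n)) % MOD
               for j in range(n)]
    return sum(partial) % MOD

# Example: three generation units with private output power levels.
powers = [120, 85, 240]
assert secure_sum(powers) == sum(powers)
```

In a consensus-based dispatch loop, such a primitive would replace each plaintext aggregation step, which is why the combination can remain information-theoretically secure against non-colluding observers.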
In the age of data digitalization, important applications of optics- and photonics-based sensors and technology lie in the field of biometrics and image processing. Protecting user data in a safe and secure way is an essential task in this area. However, traditional cryptographic protocols rely heavily on computer-aided computation. Secure protocols that rely only on human interaction are usually simpler to understand, and in many scenarios the development of such protocols is also important for ease of implementation and deployment. Visual cryptography (VC) is an encryption technique on images (or text) in which decryption is done by the human visual system. In this technique, an image is encrypted into a number of pieces (known as shares). When the printed shares are physically superimposed, the image can be decrypted with human vision. Modern digital watermarking technologies can be combined with VC for image copyright protection, where the shares can be watermarks (small identifications) embedded in the image. Similarly, VC can be used to improve the security of biometric authentication. This paper presents the design and implementation of a practical laboratory experiment based on the concept of VC for a course in media engineering. Specifically, our contribution deals with the integration of VC in different schemes for applications like digital watermarking and biometric authentication in the field of optics and photonics. We describe the theoretical concepts and propose our infrastructure for the experiment. Finally, we evaluate the learning outcome of the experiment, performed by the students. © (2016) Society of Photo-Optical Instrumentation Engineers (SPIE).
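The share-and-superimpose principle described above can be made concrete with the classic (2,2) scheme: each pixel is expanded into two subpixels per share, white pixels get identical subpixel pairs, black pixels complementary ones, so stacking the shares (a pixel-wise OR) reveals the image. This is a minimal sketch of that standard scheme, not the lab's actual implementation.

```python
import random

# Minimal sketch of a (2,2) visual cryptography scheme on a binary image.
# Each pixel expands into two subpixels per share: white pixels produce
# identical subpixel pairs in both shares, black pixels complementary
# ones, so superimposing the shares (logical OR) reveals the image.

def encrypt(image):
    """image: 2D list of 0 (white) / 1 (black). Returns two shares."""
    share1, share2 = [], []
    for row in image:
        r1, r2 = [], []
        for pixel in row:
            pattern = random.choice([(0, 1), (1, 0)])
            r1.extend(pattern)
            if pixel == 0:               # white: identical subpixels
                r2.extend(pattern)
            else:                        # black: complementary subpixels
                r2.extend((1 - pattern[0], 1 - pattern[1]))
        share1.append(r1)
        share2.append(r2)
    return share1, share2

def superimpose(s1, s2):
    """Physically stacking printed shares corresponds to a pixel-wise OR."""
    return [[a | b for a, b in zip(r1, r2)]
            for r1, r2 in zip(s1, s2)]

secret = [[1, 0], [0, 1]]
s1, s2 = encrypt(secret)
stacked = superimpose(s1, s2)
# Original black pixels stack to two black subpixels; white pixels keep
# one white subpixel, so the eye perceives the contrast difference.
```

Each share on its own is uniformly random, which is what makes the scheme secure without any computation on the decryption side.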
In a recent paper it has been shown that the effective nonlinear constant used in a P-matrix approach to describe third-order intermodulation (IMD3) in surface acoustic wave (SAW) devices can be obtained from finite element method (FEM) calculations of a periodic cell using nonlinear tensor data [1]. In this paper we extend this FEM calculation and show that the IMD3 of an infinite periodic array of electrodes on a piezoelectric substrate can be simulated directly in the sagittal plane. This direct approach opens the way for an FEM-based simulation of nonlinearities in finite and generalized structures, avoiding the simplifications of phenomenological approaches.
Wireless Sensor Networks (WSN) have emerged as an interesting topic in the research community due to their manifold applications. One of the main challenges in this field is the energy consumption of the nodes, which is typically quite restricted due to the required lifetime of such WSNs. To solve this problem, several energy-saving MAC protocols have been developed. One of them, recently presented by the authors, is the so-called SmartMAC, an extension to the IEEE802.15.4 standard. In this paper, we present the implementation details of porting the SmartMAC protocol to the discrete-event network simulator NS3. We developed this NS3 module to simulate the performance, multi-node execution, and multi-node configuration. Along with this model, we also present an energy model for the evaluation of the energy consumption. The current implementation in NS3 is based on LR-WPAN (Low-Rate Wireless Personal Area Networks) as specified by the IEEE802.15.4 (2006) standard. The simulation results show that SmartMAC, with its sleep and wake-up mechanisms for the transceivers, is significantly more efficient than the current NS3 MAC (Medium Access Control) scheme.
Ultra-wideband (UWB) signals are well suited both for short-range wireless communication and for high-precision localization applications. Channel impulse response (CIR) analysis in UWB systems is a major element of localization estimation. In this paper, practical aspects of CIR are presented: in particular, a technique for the construction of the accumulated echo-gram of a multipath-delayed signal is proposed. Decawave hardware was used to demonstrate the technique for analyzing the fine structure of signals with sub-nanosecond resolution. Temporal stability, reliability and two-way characteristics of such echo-grams are discussed as well. The results of using two EVK1000 radio modules as a radar installation to detect a target in indoor environments show that low-cost UWB intrusion detection and through-the-wall vision systems might be developed using the proposed technique.
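The core idea of accumulating an echo-gram can be sketched as averaging the magnitudes of repeated CIR measurements so that stable multipath echoes stand out against noise. The synthetic CIR data, tap positions and function name below are assumptions for illustration; the paper works with real Decawave EVK1000 CIR data.

```python
import random

# Hypothetical sketch: build an "accumulated echo-gram" by averaging
# the magnitudes of repeated CIR measurements. Stable echoes reinforce
# while zero-mean noise averages down. Synthetic data stands in for the
# real Decawave CIR taps used in the paper.

def accumulate_echogram(cir_measurements):
    """Average |CIR| tap-by-tap over repeated measurements."""
    n = len(cir_measurements)
    length = len(cir_measurements[0])
    return [sum(abs(m[i]) for m in cir_measurements) / n
            for i in range(length)]

# Synthetic CIRs: echoes at taps 3 and 7 plus Gaussian noise.
random.seed(0)
cirs = [[(1.0 if i in (3, 7) else 0.0) + random.gauss(0, 0.2)
         for i in range(16)]
        for _ in range(200)]
acc = accumulate_echogram(cirs)
# The accumulated echo-gram peaks near the true echo taps.
```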
Wireless sensor networks have found their way into a wide range of applications, among which environmental monitoring systems have attracted increasing interest from researchers. The main challenges for these applications are the scalability of the network size and the energy efficiency of the spatially distributed motes. These devices are mostly battery-powered and spend most of their energy budget on the radio transceiver module. So-called Wake-On-Radio (WOR) technology can be used to achieve a reasonable balance among power consumption, range, complexity and response time. In this paper, a novel design for the integration of WOR into IEEE802.15.4 is presented, which flexibly allows trade-offs between the energy consumption of sender and receiver stations as well as between real-time capability and energy consumption. For identical behavior, the proposed scheme is significantly more efficient than other schemes proposed in recent publications, while preserving backward compatibility with standard IEEE802.15.4 transceivers.
The industry of the agave-derived bacanora, in the northern Mexican state of Sonora, has been growing substantially in recent years. However, this higher demand still lies under the influence of a variety of social, legal, cultural, ecological and economic factors. The governmental institutions of the state have tried to encourage sustainable development and certain levels of standardization in the production of bacanora by applying different economic and legal strategies. However, a large portion of this alcoholic beverage is still produced in a traditional and rudimentary fashion. Beyond the quality of the beverage, the lack of proper control using adequate instrumental methods might represent a health risk, as in several cases traditionally distilled beverages can contain elevated levels of harmful substances. The present article describes the qualitative spectral analysis of samples of the traditionally produced distilled beverage bacanora in the range from 0 cm−1 to 3500 cm−1 using a Fourier transform Raman spectrometer. This particular technique has not previously been explored for the analysis of bacanora, in contrast to other beverages, including tequila. The proposed instrumental arrangement for the spectral analysis was built by combining conventional hardware parts (Michelson interferometer, photodiodes, visible laser, etc.) and a set of self-developed evaluation algorithms. The resulting spectral information was compared to that of pure samples of ethanol and to the spectra of different samples of the alcoholic beverage tequila. The proposed instrumental arrangement can be used for the analysis of bacanora.
Components of rocket engines, such as actively cooled combustion chambers, must withstand high pressure as well as severe and complex thermal transients. While the thermal transients result in temperature gradients and, thus, in constrained thermal strains, the pressure load induces mean stresses. To assess the mechanical behaviour of such components during design via finite-element calculations, constitutive models are necessary that appropriately describe the time- and temperature-dependent plasticity of the material.
Advanced models account for viscoplastic deformations including isotropic and kinematic hardening, recovery and ratcheting. However, these models contain a relatively large number of temperature-dependent material properties that must be determined on the basis of material test data. The determination of the properties is a non-trivial task because it is not clear which loading history must be applied in the tests for a certain material to obtain stable and robust (i.e. objective) material properties. Consequently, the determined properties depend on the underlying loading history in the tests as well as on the experience and judgement of the person who determined them. This results in uncertainties during the assessment of the components that must be countered with conservative designs, leading to negative consequences in terms of mass and costs.
It is the aim of this work, funded by the European Space Agency (ESA), to derive a procedure for determining stable and robust material properties of an advanced viscoplastic constitutive model for aerospace materials. To this end, a special loading history is applied in isothermal material tests conducted with copper at different temperatures in the range from 300 to 700 K. To determine the material properties and to assess their stability and robustness, numerical optimization as well as analytical and statistical methods are used. The determined material properties are validated on the basis of results of thermomechanical material tests, also conducted in the temperature range from 300 to 700 K.
We herein present a topology design method based on local optimality criteria which has been implemented in an open-source Navier-Stokes solver for turbulent flows. Our method aims at the fast generation of geometry proposals in the early conceptual phase. To the best of our knowledge, this is the first local-criteria approach utilizing a wall-function turbulence model in order to consider turbulent flows. In order to allow for growth and shrinkage, or even the formation or disappearance of structural features, a topological approach is chosen. By introducing a volume fraction parameter, we distinguish between fluid and solid properties in each control volume. The fluid-solid interface is represented by an immersed boundary method using a piecewise linear surface reconstruction.
The humanoid Sweaty was the finalist in this year's RoboCup soccer championship (adult size). For the optimization of the gait and the stability, data concerning forces and torques in the ankle joints would be helpful. In the following paper the development of a six-axis force and torque sensor for the humanoid robot Sweaty is described. Since commercial sensors do not meet the demands for the sensors in Sweaty's ankle joints, a new sensor was developed. As measuring devices we used strain gauges and custom electronics based on an acam PS09. The geometry was analyzed with the FEM program ANSYS to obtain optimal dimensions for the measuring beams. In addition, ANSYS was used to optimize the position of the strain gauges on the beam.
This paper focuses on appropriately measuring the accuracy of forecasts of load behavior and renewable generation in micro-grid operation. Common accuracy measures such as the root mean square error are often difficult to interpret for system design, as they describe the mean accuracy of the forecast. Micro-grid systems, however, have to be designed to also handle worst-case situations. This paper therefore suggests two error measures that are based on the maximum function and that allow a better understanding of worst-case requirements with respect to balancing power and balancing energy supply.
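The contrast between mean-based and maximum-based measures can be sketched in a few lines. The exact definitions of the paper's two measures are not given in the abstract, so the maximum absolute error below is an illustrative stand-in, compared against the RMSE it is meant to complement.

```python
import math

# Illustrative comparison of a mean-based and a maximum-based forecast
# error measure. The paper's exact definitions are not stated in the
# abstract; these formulas are assumptions for illustration.

def rmse(forecast, actual):
    """Root mean square error: describes the mean forecast accuracy."""
    return math.sqrt(sum((f - a) ** 2 for f, a in zip(forecast, actual))
                     / len(forecast))

def max_abs_error(forecast, actual):
    """Worst-case deviation: the quantity that drives the sizing of
    balancing power in a micro-grid."""
    return max(abs(f - a) for f, a in zip(forecast, actual))

forecast = [10.0, 12.0, 11.0, 15.0]
actual   = [10.5, 11.0, 14.0, 15.0]
# A single large deviation dominates the maximum-based measure, while
# the RMSE averages it out over the whole horizon.
print(rmse(forecast, actual), max_abs_error(forecast, actual))
```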
Transthoracic impedance cardiography (ICG) is a non-invasive method for the determination of hemodynamic parameters. The basic principle of transthoracic ICG is the measurement of the electrical conductivity of the thorax over time. The aim of the study was the analysis of hemodynamic parameters from healthy individuals and the evaluation of various hemodynamic monitoring devices. Fourteen men (mean age 25 ± 4.59 years) and twelve women (mean age 24 ± 3.5 years) were measured in the cardiovascular engineering laboratory at Offenburg University of Applied Sciences, Offenburg, Germany. The ICG recordings were made with the devices CardioScreen 1000, CardioScreen 2000 and TensoScreen with the corresponding software Cardiovascular Lab 2.5 (Medis Medizinische Messtechnik GmbH, Ilmenau, Germany). In order to create identical conditions, all measurements were recorded in the same position and for the same duration. Various positions were simulated, from a horizontal lying position to a vertical standing position. Altogether, more than 30 hemodynamic parameters were measured.
In this work, we consider a duty-cycled wireless sensor network under the assumption that the on/off schedules are uncoordinated. In such networks, as not all nodes may be awake during the transmission of time synchronization messages, nodes need to re-transmit the synchronization messages. Ideally, a node should re-transmit for the maximum sleep duration to ensure that all nodes are synchronized. However, such a proposition would immensely increase the energy consumption of the nodes. This situation demands an upper bound on the number of re-transmissions. We refer to the time a node spends re-transmitting the control message as the broadcast duration, and ask what the broadcast duration should be to ensure that a certain percentage of the available nodes is synchronized. The problem of estimating the broadcast duration is formulated so as to capture the probability threshold of the nodes being synchronized. Results show that the proposed analytical model can predict the broadcast duration within a given error margin under real-world conditions, thus demonstrating the efficiency of our solution.
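The trade-off between broadcast duration and synchronization coverage can be illustrated with a toy model (an assumption for illustration, not the paper's analytical model): each node wakes for a window of length w once per period T at a uniformly random offset, and hears a broadcast of duration d iff its wake window overlaps the broadcast. In that model the overlap probability is min(1, (d + w)/T), giving a closed-form minimum duration for a target coverage p.

```python
import random

# Toy model of uncoordinated duty cycling (an illustrative assumption,
# not the paper's model): each node is awake for w seconds per period T,
# starting at a uniformly random offset. A broadcast occupying [0, d)
# reaches a node iff the node's wake window overlaps that interval.

def sync_fraction(d, T, w, n=100_000):
    """Monte-Carlo estimate of the fraction of nodes reached by a
    broadcast of duration d."""
    hits = 0
    for _ in range(n):
        offset = random.uniform(0, T)   # start of the node's wake window
        # Overlap with [0, d) modulo T: either the window starts during
        # the broadcast, or it wraps past T back into [0, d).
        if offset < d or offset + w > T:
            hits += 1
    return hits / n

def min_broadcast_duration(p, T, w):
    """Closed form in this toy model: overlap prob = min(1, (d + w)/T),
    so coverage p needs d >= p*T - w."""
    return max(0.0, p * T - w)

T, w = 10.0, 1.0
d = min_broadcast_duration(0.9, T, w)   # 8.0 s in this model
```

The paper's contribution is precisely a more realistic analytical version of this question, validated against real-world conditions.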
The Institute of Reliable Embedded Systems and Communication Electronics at Offenburg University of Applied Sciences, Germany, has developed an automated testing environment, Automated Physical TestBeds (APTB), for analyzing the performance of wireless systems and their supporting protocols. Wireless physical networking nodes can connect to this APTB, and their antenna outputs are attached to RF waveguides. To model the RF environment, these RF waveguides then establish wired connections among RF elements such as splitters, attenuators and switches. In such a setup it is possible to vary the path characteristics by altering the attenuators and switches. The major advantage of using the APTB is the possibility of an isolated, well-controlled, repeatable test environment in various conditions to run statistical analyses and even to execute regression tests. This paper provides an overview of the design and implementation of the APTB and demonstrates its ability to automate test cases, as well as its efficiency.