Implementation of lightweight design in the product development process of unmanned aerial vehicles
(2017)
The development and manufacturing of unmanned aerial vehicles (UAVs) require a multitude of design rules. Additive manufacturing (AM) processes provide a number of significant advantages over conventional production methods, particularly for meeting requirements regarding lightweight construction and sustainability. We present a new, promising approach that combines very light structural elements in a ribbed construction with a covering attached by means of foil. This contribution develops and presents a development process based on various development cycles. These cycles differ in their effort and scope and may comprise only one part of the development process or the entire process. The applicability of this development process is demonstrated in a comprehensive case study whose aim is to develop an additively manufactured product, in the form of a UAV, that is as light as possible, along with a sustainable manufacturing process for it. Finally, the results of the case study are analyzed with regard to the improvement of lightweight construction.
The aim of this data collection is to reinforce the evidence for the effectiveness of SCS in treating neuropathic chronic pain. The very low percentage of undesired side effects or complications reported in our case series suggests that all implants should be performed by similarly well-trained and experienced professionals.
Defining Recrutainment: A Model and a Survey on the Gamification of Recruiting and Human Resources
(2017)
Recrutainment is a hybrid word combining recruiting and entertainment. It describes the combination of human resources activities and gamification: concepts and methods from game design are now used to assess and select future employees. Beyond this area, recrutainment is also applied to internal processes such as professional development, and even to marketing campaigns. This paper's contribution has four components: (1) we provide a conceptual background, leading to a more precise definition of recrutainment; (2) we develop a new model for analyzing recrutainment solutions; (3) we present a corpus of 42 applications and use the new model to assess their strengths and potentials; (4) we provide a bird's-eye view of the state of the art in recrutainment and show the current weighting of gamification and recruiting aspects.
Applications that help us maintain focus on work are called "Zenware" (from concentration and Zen). While form factors, use cases, and functionality vary, all these applications have a common goal: creating uninterrupted, focused attention on the task at hand. The rise of such tools exemplifies users' desire to control their attention within a context of omnipresent distraction. In expert interviews, we investigate approaches to attention management at the workplace of knowledge workers. To gain a broad understanding, we use judgement sampling in interviews with experts from several disciplines, and we especially explore how focus and flow can be stimulated. Our contribution has four components: a brief overview of the state of the art (1), a presentation of the results (2), strategies for coping with digital distractions and design guidelines for future Zenware (3), and an outlook on the overall potential in digital work environments (4).
This chapter portrays the historical and mathematical background of dynamic and procedural content generation (PCG). We present and compare various PCG methods and analyze which mathematical approach is suited for typical applications in game design. Next, a structural overview of games applying PCG, as well as of types of PCG, is presented. As abundant PCG content can be overwhelming, we discuss context-aware adaptation as a way to adapt the challenge to individual players' requirements. Finally, we take a brief look at the future of PCG.
The M-Bus protocol (EN 13757) is in widespread use for metering applications within home area and neighborhood area networks, but lacks a strict specification. This may lead to incompatibilities in real-life installations and to problems in the deployment of new M-Bus networks. This paper presents the development of a novel testbed to emulate physical Metering Bus (M-Bus) networks with different topologies and to allow the flexible verification of real M-Bus devices in real-world scenarios. The testbed is designed to support device manufacturers and service technicians in testing and analyzing their devices within a specific network before installation. The testbed is fully programmable, allowing flexible changes of network topologies, cable lengths, and cable types. It is easy to use, as only the master and slave devices have to be physically connected. This allows multiple tests, including automated regression tests, to be performed autonomously. The testbed is available to other researchers and developers; we invite companies and research institutions to use this M-Bus testbed to increase the common knowledge and real-world experience.
Phenolic compounds, such as flavonoids and phenolic acids, are very important substances that occur in various medicinal plants. They show different pharmacological activities which might be useful in the therapy of many diseases. Phenolic compounds have attracted increasing interest over the last years because they are easily oxidized and thus act as strong antioxidants. We present the chemiluminescence of different phenolic compounds measured directly on high-performance thin-layer chromatography LiChrospher® plates, using the oxalic acid derivative bis(2,4,6-trichlorophenyl) oxalate (TCPO) in conjunction with H2O2. Our results indicate that the chemiluminescence intensity increases with the number of phenolic groups in the molecule. The method can be used to detect phenolic compounds in beverages such as coffee, tea, and wine.
The number of impaired persons is rising, as a result of both regular degradation with age and psychological problems such as burnout. Sheltered work organizations aim to reintegrate impaired persons into work environments and prepare them for re-entry into the regular job market.
For both elderly and impaired persons, it is crucial to quickly assess abilities, to identify limits and potentials, and thus to find work processes suitable for their skill profile.
This work focuses on the analysis and comparison of software tools that assess the abilities of persons with impairments. We describe two established generic tools (CANTAB, Cogstate), analyze a yet unknown specialized tool (Hamet), and present a new gamified tool (GATRAS).
Finally, we present a study with 20 participants with impairments, comparing the tools against a ground-truth baseline generated by a real-world assembly task.
Drehzahlzustandsregelung
(2016)
In this work, we consider a duty-cycled wireless sensor network under the assumption that the on/off schedules are uncoordinated. In such networks, since not all nodes may be awake during the transmission of time-synchronization messages, nodes need to re-transmit the synchronization messages. Ideally, a node should re-transmit for the maximum sleep duration to ensure that all nodes are synchronized. However, this would immensely increase the energy consumption of the nodes, which demands an upper bound on the number of re-transmissions. We refer to the time a node spends re-transmitting the control message as the broadcast duration, and we ask: what should the broadcast duration be to ensure that a certain percentage of the available nodes are synchronized? The problem of estimating the broadcast duration is formulated so as to capture the probability threshold of the nodes being synchronized. Results show that the proposed analytical model can predict the broadcast duration within a low error margin under real-world conditions, demonstrating the efficiency of our solution.
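A back-of-the-envelope version of the question formalized above can be sketched as follows. This is a toy model, not the paper's analytical model: assume each node wakes at a uniformly random phase within a period T and stays awake for a fraction d of it; the probability that its awake window overlaps a broadcast of duration B is then roughly min(1, (B + dT)/T), which can be inverted to pick B for a target synchronization percentage.

```python
def min_broadcast_duration(period, duty_cycle, target_fraction):
    """Toy model: smallest broadcast duration B such that a node with a
    uniformly random wake-up phase overlaps the broadcast with probability
    >= target_fraction.

    Overlap probability for a broadcast of length B is min(1, (B + w) / T),
    where w = duty_cycle * T is the awake window per period T.
    """
    awake = duty_cycle * period
    return max(0.0, target_fraction * period - awake)
```

For example, with a 10 s wake-up period, a 10% duty cycle, and a 90% synchronization target, this sketch yields an 8 s broadcast duration; the trade-off between broadcast duration and energy consumption discussed in the abstract is exactly this function growing towards the full period as the target approaches 100%.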
IPv6 over resource-constrained devices (6Lo) has emerged as a de-facto standard for Internet of Things (IoT) applications, especially in home and building automation systems. We provide the results of an investigation of the applicability of 6LoWPAN with RPL mesh networks for home and building automation use cases. The proper selection of Trickle parameters and neighbor-reachable time-outs is important in the RPL protocol suite in order to respond efficiently to any path failure. These parameters were analyzed in the context of energy consumption, with respect to the number of control packets. The measurements were performed in an Automated Physical Testbed (APTB). The results match the recommendations of RFC 7733 for selecting the various parameters of the RPL protocol suite. This paper shows the relationship between various RPL parameters and control-traffic overhead during network rebuild. Comparative measurements with Bluetooth Low Energy (BLE) showed that 6Lo with RPL outperformed BLE in this use case, with less control-traffic overhead.
Electric arc furnaces (EAF) are complex industrial plants whose actual behavior depends on numerous factors. Due to its energy-intensive operation, the EAF process has always been subject to optimization efforts, and several models have been proposed in the literature to analyze and predict different modes of operation. Most of these models focus on the processes inside the vessel itself. The present paper introduces a dynamic, physics-based model of a complete EAF plant which consists of four subsystems: vessel, electric system, electrode regulation, and off-gas system. Furthermore, the solid phase is not treated as homogeneous; instead, a simple spatial discretization is employed. Hence it is possible to simulate the energy input by electric arcs and fossil-fuel burners depending on the state of the melting progress. The model is implemented in the object-oriented, equation-based language Modelica, and the simulation results are compared to literature data.
We present a videodensitometric quantification method for methadone in syrup, separated by thin-layer chromatography (TLC). The quantification is based on a derivatization reaction with Dragendorff reagent. Measurements were carried out using a 16-bit flatbed scanner. The range of linearity covers two orders of magnitude, using the Kubelka-Munk expression for data transformation. The separation method is inexpensive, fast, and reliable.
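The Kubelka-Munk data transformation mentioned above has a simple closed form. A minimal sketch, assuming the common expression K/S = (1 − R)²/(2R) for relative remittance R:

```python
def kubelka_munk(remittance):
    """Kubelka-Munk function K/S = (1 - R)^2 / (2 R).

    Linearizes the relation between measured relative remittance R
    (0 < R <= 1) and analyte amount in densitometric evaluation.
    """
    r = remittance
    return (1.0 - r) ** 2 / (2.0 * r)
```

Applying this transformation to scanner readings before calibration is what extends the linear range reported in the abstract; the raw remittance itself is strongly non-linear in concentration.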
Battery degradation is a complex physicochemical process that strongly depends on operating conditions. We present a model-based analysis of lithium-ion battery degradation in a stationary photovoltaic battery system. We use a multi-scale multi-physics model of a graphite/lithium iron phosphate (LiFePO4, LFP) cell including solid electrolyte interphase (SEI) formation. The cell-level model is dynamically coupled to a system-level model consisting of photovoltaics (PV), inverter, load, grid interaction, and energy management system, fed with historic weather data. Simulations are carried out for two load scenarios, a single-family house and an office tract, over annual operation cycles with one-minute time resolution. As key result, we show that the charging process causes a peak in degradation rate due to electrochemical charge overpotentials. The main drivers for cell ageing are therefore not only a high state of charge (SOC), but the charging process leading towards high SOC. We also show that the load situation not only influences system parameters like self-sufficiency and self-consumption, but also has a significant impact on battery ageing. We assess reduced charge cut-off voltage as ageing mitigation strategy.
The DMFC is a promising option for backup power systems and for the power supply of portable devices. From the modeling point of view, however, liquid-feed DMFCs are challenging systems due to the complex electrochemistry, the inherent two-phase transport, and the effect of methanol crossover. In this paper we present a physical 1D cell model describing the relevant processes for DMFC performance, ranging from the electrochemistry on the catalyst surface up to transport on the cell level. A two-phase flow model describes the transport in the gas diffusion layer and catalyst layer on the anode side. The electrochemistry is described by elementary steps for the reactions occurring at anode and cathode, including adsorbed intermediate species on the platinum and ruthenium surfaces. Furthermore, a detailed membrane model including methanol crossover is employed. The model is validated using polarization curves, methanol crossover measurements, and impedance spectra. It permits the analysis of both steady-state and transient behavior with a high level of predictive capability. Steady-state simulations are used to investigate the open-circuit voltage as well as the overpotentials of anode, cathode, and electrolyte. Finally, the transient behavior after current interruption is studied in detail.
In the last decade, IPv6 over Low power Wireless Personal Area Networks (IEEE 802.15.4), also known as 6LoWPAN, has evolved into a primary contender for short-range wireless communications and holds the promise of an Internet of Things that is completely based on the Internet Protocol. The authors' team has developed a 6LoWPAN protocol stack in the C language that does not require a specific design environment or operating system. It is highly flexible, modular, and portable, and can be enhanced by several interesting modules, such as a Wake-On-Radio (WOR) MAC layer or a TLS 1.2-based security sublayer. The stack is made available as open source at https://github.com/hso-esk/emb6. It was extensively tested on the Automated Physical Testbed (APTB) for Wireless Systems, which is available in the authors' lab and allows a flexible setup and full control of arbitrary topologies. The measurement results demonstrate very good stability and both short-term and long-term performance, also under dynamic conditions.
Wireless communication systems are becoming more and more a part of our daily lives. Especially with the Internet of Things (IoT), overall connectivity increases rapidly as everyday objects become part of the global network. For this purpose, several new wireless protocols have arisen, among which 6LoWPAN (IPv6 over Low power Wireless Personal Area Networks) can be seen as one of the most important in this sector. Originally designed on top of the IEEE 802.15.4 standard, it is subject to various adaptations that allow 6LoWPAN to be used over different technologies, e.g., DECT Ultra Low Energy (ULE). Although this high connectivity offers many new possibilities, several requirements and pitfalls come along with such new systems. With an increasing number of connected devices, interoperability between different providers is one of the biggest challenges, which makes it necessary to verify the functionality and stability of the devices and the network. Testing therefore becomes one of the key components that decides on the success or failure of such a system. Although several protocol implementations are commonly available, e.g., for IoT-based systems, there is still a lack of corresponding tools and environments for functional and conformance testing. This article describes the architecture and operation of the proposed test framework, based on Testing and Test Control Notation Version 3 (TTCN-3), for 6LoWPAN over ULE networks.
The aim of the smart grid is to achieve a more efficient, distributed, and secure supply of energy than the traditional power grid by using a bidirectional information flow between the grid agents (e.g., generator nodes, customers). One of the key optimization problems in the smart grid is to produce power among generator nodes at minimum cost while meeting customer demand, known as the Economic Dispatch Problem (EDP). In recent years, many distributed approaches to solving the EDP have been proposed. However, protecting the privacy-sensitive data of individual generator nodes has been largely overlooked in the existing solutions. In this work, we show an attack against an existing auction-based EDP protocol, considering a non-colluding semi-honest adversary, and briefly introduce our approach towards a practical privacy-preserving EDP solution as work in progress.
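To make the underlying optimization concrete: for the textbook EDP with quadratic generation costs and no generator limits, the equal-incremental-cost condition yields a closed-form dispatch. This sketch shows only that centralized optimum; the distributed, privacy-preserving aspects the abstract addresses are deliberately omitted.

```python
def economic_dispatch(a, b, demand):
    """Dispatch for generator costs C_i(p) = a_i * p^2 + b_i * p (a_i > 0),
    ignoring generator limits.

    At the optimum, all generators run at the same incremental cost lambda:
        2 * a_i * p_i + b_i = lambda,  with  sum(p_i) = demand,
    which can be solved for lambda in closed form.
    """
    inv = [1.0 / (2.0 * ai) for ai in a]                       # 1 / C_i''
    lam = (demand + sum(bi * wi for bi, wi in zip(b, inv))) / sum(inv)
    return [(lam - bi) * wi for bi, wi in zip(b, inv)]
```

For instance, `economic_dispatch([1.0, 2.0], [0.0, 0.0], 9.0)` assigns twice as much power to the generator with the lower quadratic coefficient, while the outputs always sum to the demand.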
This book offers a compendium of best practices in game dynamics. It covers a wide range of dynamic game elements ranging from player behavior over artificial intelligence to procedural content generation. Such dynamics make virtual worlds more lively and realistic and they also create the potential for moments of amazement and surprise. In many cases, game dynamics are driven by a combination of random seeds, player records and procedural algorithms. Games can even incorporate the player’s real-world behavior to create dynamic responses. The best practices illustrate how dynamic elements improve the user experience and increase the replay value.
The book draws upon interdisciplinary approaches; researchers and practitioners from Game Studies, Computer Science, Human-Computer Interaction, Psychology and other disciplines will find this book to be an exceptional resource of both creative inspiration and hands-on process knowledge.
Gamifying rehabilitation is an efficient way to improve motivation and exercise frequency. However, among flow theory, self-determination theory, and Bartle's player types, there is much room for speculation regarding the mechanics required for successful gamification, which in turn leads to increased motivation. For our study, we selected a gamified solution for motion training (an exergame) whose playful design elements are extremely simple. The contribution is three-fold: we show best practices from the state of the art, present a study analyzing the effects of simple gamification mechanics on both a quantitative and a qualitative level, and discuss strategies for playful design in therapeutic movement games.
Designing Authentic Emotions for Non-Human Characters. A Study Evaluating Virtual Affective Behavior
(2017)
While human emotions have been researched for decades, designing authentic emotional behavior for non-human characters has received less attention. However, virtual behavior not only affects game design, but also allows the creation of authentic avatars or robotic companions. After a discussion of methods to model and recognize emotions, we present three characters with a decreasing level of human features and describe how established design techniques can be adapted for such characters. In a study, 220 participants assessed these characters' emotional behavior, focusing on the emotion "anger". We want to determine how reliably users can recognize emotional behavior when characters increasingly do not look and behave like humans. A secondary aim is to determine whether gender has an impact on competence in emotion recognition. The findings indicate that there is an area of insecure attribution of virtual affective behavior not distant from but close to human behavior. We also found that, at least for anger, men and women assess emotional behavior equally well.
This work demonstrates the potentials of procedural content generation (PCG) for games, focusing on the generation of specific graphic props (reefs) in an explorer game. We briefly portray the state-of-the-art of PCG and compare various methods to create random patterns at runtime. Taking a step towards the game industry, we describe an actual game production and provide a detailed pseudocode implementation showing how Perlin or Simplex noise can be used efficiently. In a comparative study, we investigate two alternative implementations of a decisive game prop: once created traditionally by artists and once generated by procedural algorithms. 41 test subjects played both implementations. The analysis shows that PCG can create a user experience that is significantly more realistic and at the same time perceived as more aesthetically pleasing. In addition, the ever-changing nature of the procedurally generated environments is preferred with high significance, especially by players aged 45 and above.
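As an illustration of the kind of runtime noise such productions rely on, here is a minimal 1-D value-noise sketch — a simpler cousin of the Perlin/Simplex noise named above, not the implementation from the paper. Deterministic pseudo-random values at integer lattice points are blended with a smoothstep fade, so the same seed always reproduces the same "terrain".

```python
import math
import random

def value_noise_1d(x, seed=0):
    """1-D value noise: reproducible lattice values in [0, 1), smoothly
    interpolated in between. Deterministic for a given seed."""
    def lattice(i):
        # derive a reproducible value from the lattice index and the seed
        return random.Random(i * 1_000_003 + seed).random()
    i0 = math.floor(x)
    t = x - i0
    t = t * t * (3.0 - 2.0 * t)          # smoothstep fade curve
    return (1.0 - t) * lattice(i0) + t * lattice(i0 + 1)
```

In practice, several octaves of such a function (doubling frequency, halving amplitude) are summed to obtain fractal detail, and the 2-D/3-D gradient-noise variants are preferred for visual props such as the reefs described above.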
Gamification, the playful enrichment of activities, is enjoying increasing popularity. Particularly in the areas of health (exergames) and learning (serious games, edutainment), there are numerous successful applications. In work processes, however, gamification has so far been less widespread: while there are successful approaches in the service sector (e.g., in call centers), the area of industrial production was not addressed until a few years ago.
This chapter gives an overview of the development of gamification and presents the state of the art. We derive general requirements for gamification in production environments and introduce two new approaches from current research. These are examined for acceptance in a study with trainers from the automotive industry. The results show an overall positive attitude towards gamifying production and a very high acceptance, in particular, of the pyramid design.
We present the design outline of a context-aware interactive system for smart learning in the STEM curriculum (science, technology, engineering, and mathematics). It is based on a gameful design approach and enables "playful coached learning" (PCL): a learning process enriched by gamification but also close to the learner's activities and emotional setting. After a brief introduction on related work, we describe the technological setup, the integration of projected visual feedback and the use of object and motion recognition to interpret the learner's actions. We explain how this combination enables rapid feedback and why this is particularly important for correct habit formation in practical skills training. In a second step, we discuss gamification methods and analyze which are best suited for the PCL system. Finally, emotion recognition, a major element of the final PCL design not yet implemented, is briefly outlined.
There is a growing trend towards the use of thermo-active building systems (TABS) for the heating and cooling of buildings, because these systems are known to be very economical and efficient. However, their control is complicated by the large thermal inertia, and their parameterization is time-consuming. With conventional TABS control strategies, the required thermal comfort in buildings often cannot be maintained, particularly if the internal heat sources change suddenly. This paper shows measurement results and evaluations of the operation of a novel adaptive and predictive calculation method, based on a multiple linear regression (AMLR), for the control of TABS. The measurement results are compared with the standard TABS strategy: the electrical pump energy could be reduced by more than 86%, and, including the weather adjustment, thermal energy savings of over 41% could be reached. In addition, thermal comfort could be improved thanks to the possibility of specifying mean room set-point temperatures. With the AMLR, comfort category I of the comfort standards ISO 7730 and DIN EN 15251 is met on about 95% of occasions; with the standard TABS strategy, only about 24% of occasions fall within category I.
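The regression core of such an adaptive approach can be sketched in a few lines. This is a generic ordinary-least-squares fit with hypothetical regressors; the actual AMLR method uses its own feature set and adaptation scheme.

```python
def fit_mlr(X, y):
    """Ordinary least squares for a small number of features, pure stdlib.

    Solves the normal equations (X^T X) beta = X^T y by Gaussian
    elimination with partial pivoting. X is a list of feature rows
    (e.g. [1, outdoor_temp, solar_gain]), y the observed room temperatures.
    """
    n = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(n)]
    for col in range(n):                       # forward elimination
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * n                           # back substitution
    for i in range(n - 1, -1, -1):
        beta[i] = (b[i] - sum(A[i][j] * beta[j]
                              for j in range(i + 1, n))) / A[i][i]
    return beta
```

Refitting the coefficients on a sliding window of recent measurements is what makes such a controller "adaptive": the regression absorbs slow changes in building behavior without manual re-parameterization.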
The famous violin virtuoso Nicolò Paganini (born on 27 October 1782 in Genoa, died on 27 May 1840 in Nice) left us with many puzzles. An interesting aspect is his hair: in the 19th century, hair given away as a token of friendship or romantic love became very popular, and Paganini also seems to have made use of this fad. In 2009, a lock of hair, purportedly that of Paganini, kept in a locked presentation box together with a bilingual autograph inscription saying "Alla Signora Chatterton avec les compliments de Nicolò Paganini", was bought at an auction. A sample was taken from this lock and investigated morphologically using digital light microscopy (digital microscope VHX-100, Keyence) in reflected and transmitted light, with and without polarization, at magnifications up to 5,000×. The sample was then compared with a hair sample from the possession of the Paganini family, which had been microscopically examined in 2012 by the co-author of this paper, yielding numerous figures with measurement results that had been stored and could be retrieved for direct comparison. The hair sample consisted of ten strands of hair or hair fragments and was investigated in great detail for the following parameters: exogenous hair damage, especially feeding traces caused by parasites; modeling and angulation of hairs; hair thickness; medulla and pigmentation; curling; and mercury load on the trace material. After evaluation of all findings, not only can a non-exclusion of identity be determined; due to the broad match even of rare findings, there is no reasonable doubt about identity. In addition, the findings suggest that the studied hair samples are in fact from Paganini's head. The present case of Nicolò Paganini's hair lock is also an excellent starting point for reflections on the probative value of trace hair investigations, a point that is critically discussed in the paper.
Finally, this study shows that said lock of hair had probably really been dedicated and given to Eliza Davenport Latham (born on 25 November 1806, died on 9 January 1877), the future wife of John Balsir Chatterton (born on 25 November 1804, died on 9 April 1871), at that time the best-known and most famous English harpist. Paganini must have met her on his 1831/32 concert tour, during which he travelled to Paris, London, the rest of England, Scotland, and Ireland.
All you need is sleep
(2016)
In the 21st century, the century in which humanity hopes to embark on interplanetary travel, we have yet to fully understand our very own idiosyncratic terra incognita: human sleep. Sleep is a highly conserved evolutionary process that constitutes approximately one third of our life, and lack of sleep or inadequate sleep may lead to impairment across multiple cognitive domains (Tononi and Cirelli, 2014; Lim and Dinges, 2010). Sleep deprivation also leads to aberrant brain functioning and immunological and metabolic collapse, and, if sufficiently prolonged, it will ultimately lead to death (Tononi and Cirelli, 2014).
There is increasing evidence of central hyperexcitability in chronic whiplash-associated disorders (cWAD). However, little is known about how an apparently simple cervical spine injury can induce changes in cerebral processes. The present study was designed (1) to validate previous results showing alterations of regional cerebral blood flow (rCBF) in cWAD, (2) to test if central hyperexcitability reflects changes in rCBF upon non-painful stimulation of the neck, and (3) to verify our hypothesis that the missing link in understanding the underlying pathophysiology could be the close interaction between the neck and midbrain structures. For this purpose, alterations of rCBF were explored in a case-control study using H215O positron emission tomography, where each group was exposed to four different conditions, including rest and different levels of non-painful electrical stimulation of the neck. rCBF was found to be elevated in patients with cWAD in the posterior cingulate and precuneus, and decreased in the superior temporal, parahippocampal, and inferior frontal gyri, the thalamus and the insular cortex when compared with rCBF in healthy controls. No differences in rCBF were observed between different levels of electrical stimulation. The alterations in regions directly involved with pain perception and interoceptive processing indicate that cWAD symptoms might be the consequence of a mismatch during the integration of information in brain regions involved in pain processing.
A Survey of Channel Measurements and Models for Current and Future Railway Communication Systems
(2016)
The humanoid Sweaty was a finalist in this year's RoboCup soccer championship (adult size). For optimizing its gait and stability, data on the forces and torques in the ankle joints would be helpful. This paper describes the development of a six-axis force and torque sensor for the humanoid robot Sweaty. Since commercial sensors do not meet the demands of Sweaty's ankle joints, a new sensor was developed. As measuring devices we used strain gauges and custom electronics based on an acam PS09. The geometry was analyzed with the FEM program ANSYS to obtain optimal dimensions for the measuring beams; ANSYS was also used to optimize the positions of the strain gauges on the beams.
Due to its numerous application fields and benefits, virtualization has become an interesting and attractive topic in computer and mobile systems, as it promises advantages for security and cost efficiency. However, it may bring additional performance overhead. Recently, CPU virtualization has become more popular for embedded platforms, where this performance overhead is especially critical. In this article, we present measurements of the performance overhead of the two hypervisors Xen and Jailhouse on ARM processors under the heavy-load "Cpuburn-a8" application and compare them to a native Linux system running on ARM processors.
Remote code attestation protocols are an essential building block for reasonable system security on wireless embedded devices. In the work at hand, we investigate in detail the trustability of a purely software-based remote code attestation inference mechanism over the wireless channel, e.g., when running the prominent protocol derivate SoftWare-based ATTestation for Embedded Devices (SWATT). Besides disclosing the pitfalls of this protocol class, we also point out good parameter choices that allow at least a meaningful plausibility check with a balanced false-positive and false-negative ratio.
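The attestation principle behind SWATT-style protocols can be illustrated with a toy checksum walk. This is illustrative only: the real protocol uses an optimized routine on the device and relies on precise timing of the response, neither of which is modeled here.

```python
import random

def swatt_checksum(memory, key, iterations):
    """Toy SWATT-style attestation primitive.

    Walks memory in a keyed pseudo-random order and folds each visited
    byte into a rotating 16-bit checksum. A verifier that knows the
    expected memory image and the challenge key can recompute the value;
    a device with modified memory is likely to produce a different one.
    """
    rng = random.Random(key)
    checksum = 0
    for _ in range(iterations):
        addr = rng.randrange(len(memory))
        checksum = ((checksum << 1) | (checksum >> 15)) & 0xFFFF  # rotl16
        checksum ^= memory[addr]
    return checksum
```

The parameter choices discussed above map onto this sketch: too few iterations leave memory regions unvisited (false negatives), while channel noise and timing jitter drive false positives, so the iteration count and response deadline must be balanced.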
Proton Exchange Membrane Fuel Cells (PEMFC) are energy-efficient and environmentally friendly alternatives to conventional energy conversion systems in many emerging applications. In order to enable predictions of their performance and durability, it is crucial to gain a deeper understanding of the relevant operating phenomena, e.g., electrochemistry, transport phenomena, and thermodynamics, as well as of the mechanisms leading to the degradation of cell components. Achieving the goal of providing predictive tools to model PEMFC performance, durability, and degradation is a challenging task that requires the development of detailed and realistic models reaching from the atomic/molecular scale over the meso scale of structures and materials up to the component, stack, and system level. In addition, an appropriate way of coupling the different scales is required.
This review provides a comprehensive overview of the state of the art in modeling of PEMFC, covering all relevant scales from atomistic up to system level as well as the coupling between these scales. Furthermore, it focuses on the modeling of PEMFC degradation mechanisms and on the coupling between performance and degradation models.
This paper focuses on appropriately measuring the accuracy of forecasts of load behavior and renewable generation in micro-grid operation. Common accuracy measures such as the root mean square error are often difficult to interpret for system design, as they describe the mean accuracy of the forecast. Micro-grid systems, however, also have to be designed to handle worst-case situations. This paper therefore suggests two error measures that are based on the maximum function and that better capture worst-case requirements with respect to balancing power and balancing energy supply.
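The distinction between mean accuracy and worst-case requirements can be sketched as follows; the two max-based measures below are illustrative stand-ins, not the exact definitions from the paper:

```python
import math

def rmse(forecast, actual):
    """Mean accuracy: root mean square of the forecast error."""
    return math.sqrt(sum((f - a) ** 2 for f, a in zip(forecast, actual)) / len(forecast))

def max_power_error(forecast, actual):
    """Worst-case balancing power: largest single-step forecast error."""
    return max(abs(f - a) for f, a in zip(forecast, actual))

def max_energy_error(forecast, actual, dt_h=1.0):
    """Worst-case balancing energy: largest magnitude of the error
    integrated over time (time step dt_h in hours)."""
    running, worst = 0.0, 0.0
    for f, a in zip(forecast, actual):
        running += (f - a) * dt_h
        worst = max(worst, abs(running))
    return worst

forecast = [10, 12, 11, 13]  # e.g. forecast load in kW
actual = [11, 10, 12, 10]    # observed load in kW
print(rmse(forecast, actual))             # ~1.94 (mean error in kW)
print(max_power_error(forecast, actual))  # 3 (worst single step, kW)
print(max_energy_error(forecast, actual)) # 3.0 (worst accumulated error, kWh)
```

A low RMSE can coexist with a large accumulated error, which is why the max-based measures are more informative for sizing balancing power and storage.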
BGH "kinox.to"
(2017)
EuGH "comtech"
(2017)
Minors rightly enjoy special protection in various areas of law. These include the general contract law of the German Civil Code (BGB), the unfair competition law of the UWG, and also data protection law, where this is explicitly laid down in the General Data Protection Regulation (DS-GVO). This contribution discusses some of the relevant questions.
Commentary on design law; commentary on trademark law; commentary on copyright law
(2017)
In this paper we show that a model-free approach to learning behaviors in joint space can successfully be used to utilize the toes of a humanoid robot. Keeping the approach model-free makes it applicable to any kind of humanoid robot, or any robot in general. Here we focus on the benefit for robots with toes, which is otherwise difficult to exploit. The task was to learn different kick behaviors on simulated Nao robots with toes in the RoboCup 3D soccer simulator. As a result, the robot learned to step on its toe for a kick that performs 30% better than the same kick learned without toes.
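The model-free idea can be sketched as a black-box search over joint-space parameters; this hill-climbing loop and the stand-in reward function are illustrative assumptions, not the learning method actually used in the paper:

```python
import random

def learn(reward, dim=4, iters=200, sigma=0.1, seed=1):
    """Model-free hill climbing over a joint-space parameter vector:
    perturb the current parameters and keep the change whenever the
    (black-box) reward improves. No model of the robot is needed."""
    rng = random.Random(seed)
    params = [0.0] * dim
    best = reward(params)
    for _ in range(iters):
        cand = [p + rng.gauss(0, sigma) for p in params]
        r = reward(cand)
        if r > best:
            params, best = cand, r
    return params, best

# Stand-in reward: in the paper this would be the simulated kick distance.
target = [0.5, -0.3, 0.8, 0.1]
reward = lambda p: -sum((a - b) ** 2 for a, b in zip(p, target))
params, best = learn(reward)
print(best)  # close to the optimum of 0
```

Because only the scalar reward is observed, the same loop works whether the parameters drive toe joints, ankle joints, or any other actuator.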
Interne Revision: Anforderungen an die Wirksamkeit – ein Vergleich von IDW EPS 983 und MaRisk
(2017)
To date, it is difficult to find high-level statistics on YouTube that paint a fair picture of the platform in its entirety. This study attempts to provide an overall characterization of YouTube, based on a random sample of channel and video data, by showing how video provision and consumption evolved over the course of the past 10 years. It demonstrates stark contrasts between video genres in terms of channels, uploads, and views, and shows that a vast majority of, on average, 85% of all views goes to a small minority of 3% of all channels. The analytical results give evidence that older channels have a significantly higher probability of garnering a large viewership, but also show that there has always been a small chance for young channels to become successful quickly, depending on whether they choose their genre wisely.
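The reported concentration of views (on average 85% of views on 3% of channels) corresponds to a simple top-share statistic, sketched here on toy data; the function and the data are illustrative, not the study's:

```python
def top_share(views, top_frac=0.03):
    """Fraction of all views that goes to the top `top_frac` of channels."""
    ranked = sorted(views, reverse=True)
    k = max(1, round(len(ranked) * top_frac))
    return sum(ranked[:k]) / sum(ranked)

# Toy data: a heavy-tailed view distribution over 100 channels,
# where a few channels collect nearly all views.
views = [1000, 900] + [1] * 98
print(top_share(views))  # share of views held by the top 3 channels
```

On real channel samples the same statistic quantifies how skewed the attention economy of the platform is.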
In this paper, we present a frame synchronization method that consists of the non-orthogonal superposition of a synchronization sequence and the data. We derive the optimum detection criterion and compare it to the classical sequential concatenation of synchronization and data sequences. Computer simulations confirm the benefits of the non-orthogonal allocation for the case of short frames, which makes this technique particularly suited for the increasingly important regime of low-latency and ultra-reliable communication.
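A plain correlation detector illustrates how a synchronization sequence superimposed on the data can still be located in the received samples; this is a simplified sketch, not the optimum detection criterion derived in the paper:

```python
def correlate_sync(rx, sync):
    """Slide the known sync sequence over the received samples and
    return the offset with the largest correlation value."""
    best_off, best_val = 0, float("-inf")
    for off in range(len(rx) - len(sync) + 1):
        c = sum(rx[off + i] * s for i, s in enumerate(sync))
        if c > best_val:
            best_off, best_val = off, c
    return best_off

sync = [1, -1, 1, 1, -1]
data = [0.2, -0.1, 0.3, -0.2, 0.1]
# Non-orthogonal superposition: the sync sequence is added on top of
# the data symbols instead of being prepended to them.
frame = [d + s for d, s in zip(data, sync)]
rx = [0.05, -0.02] + frame + [0.01]  # frame embedded in noise samples
print(correlate_sync(rx, sync))  # frame start at offset 2
```

At the true offset the correlation contains the full sync-sequence energy plus a small data cross-term, which is why the superposition costs no extra frame length.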
Since their dawning, space communications have been among the strongest driving applications for the development of error correcting codes. Indeed, space-to-Earth telemetry (TM) links have extensively exploited advanced coding schemes, from convolutional codes to Reed-Solomon codes (also in concatenated form) and, more recently, from turbo codes to low-density parity-check (LDPC) codes. The efficiency of these schemes has been extensively proved in several papers and reports. The situation is a bit different for Earth-to-space telecommand (TC) links. Space TCs must reliably convey control information as well as software patches from Earth control centers to scientific payload instruments and engineering equipment onboard (O/B) spacecraft. The success of a mission may be compromised because of an error corrupting a TC message: a detected error causing no execution or, even worse, an undetected error causing a wrong execution. This imposes strict constraints on the maximum acceptable detected and undetected error rates.
Within this work, the benefits of using predictive control methods for the operation of Adsorption Cooling Machines (ACMs) are shown in a simulation study. Since the internal control decisions of series-manufactured ACMs often cannot be influenced, the work focuses on the optimized scheduling of an ACM, considering its internal functioning as well as forecasts of load and driving energy occurrence. For illustration, an assumed solar thermal climate system is introduced, and a system model suitable for use within gradient-based optimization methods is developed. The results of a system simulation using a conventional scheme for ACM scheduling are compared to the results of a predictive, optimization-based scheduling approach for the same exemplary scenario of load and driving energy occurrence. The benefits of the latter approach are shown, and future steps toward applying these methods in system control are addressed.
Our university carries out various research projects. Among others, the project Schluckspecht is an interdisciplinary effort on different ultra-efficient car concepts for international contests. Besides the engineering work, one part of the project deals with real-time data visualization. In order to increase the efficiency of the vehicle, online monitoring of the runtime parameters is necessary. The driving parameters of the vehicle are transmitted to a processing station via a wireless network connection. We plan to use an augmented reality (AR) application to visualize different data on top of the view of the real car. Using a mobile Android or iOS device, a user can interactively view various real-time and statistical data. The car and its components are meant to be augmented with various additional information, which should appear at the correct position of the respective components. An engine, for example, could show its current rpm and consumption values; a battery could show its current charge level. The goal of this paper is to evaluate different possible approaches and their suitability, and to expand our application to other projects at our university.
Non-invasive, non-ionizing functional neuroimaging with spatially and temporally high-resolution electroencephalography or real-time near-infrared spectroscopy, combined with modern robotic systems, is a decisive development step in the field of neuroprosthetics and brain-machine interfaces. Research on this is being conducted in medical engineering at Hochschule Offenburg.
Objectives: Speech recognition on the telephone poses a challenge for patients with cochlear implants (CIs) due to a reduced bandwidth of transmission. This trial evaluates a home-based auditory training with telephone-specific filtered speech material to improve sentence recognition. Design: Randomised controlled parallel double-blind. Setting: One tertiary referral centre. Participants: A total of 20 postlingually deafened patients with CIs. Main outcome measures: Primary outcome measure was sentence recognition assessed by a modified version of the Oldenburg Sentence Test filtered to the telephone bandwidth of 0.3-3.4 kHz. Additionally, pure tone thresholds, recognition of monosyllables and subjective hearing benefit were acquired at two separate visits before and after a home-based training period of 10-14 weeks. For training, patients received a CD with speech material, either unmodified for the unfiltered training group or filtered to the telephone bandwidth in the filtered group. Results: Patients in the unfiltered training group achieved an average sentence recognition score of 70.0%±13.6% (mean±SD) before and 73.6%±16.5% after training. Patients in the filtered training group achieved 70.7%±13.8% and 78.9%±7.0%, a statistically significant difference (P=.034, t10 =2.292; two-way RM ANOVA/Bonferroni). An increase in the recognition of monosyllabic words was noted in both groups. The subjective benefit was positive for filtered and negative for unfiltered training. Conclusions: Auditory training with specifically filtered speech material provided an improvement in sentence recognition on the telephone compared to training with unfiltered material.
BACKGROUND:
While hearing aids for a contralateral routing of signals (CROS-HA) and bone conduction devices have been the traditional treatment for single-sided deafness (SSD) and asymmetric hearing loss (AHL), in recent years, cochlear implants (CIs) have increasingly become a viable treatment choice, particularly in countries where regulatory approval and reimbursement schemes are in place. Part of the reason for this shift is that the CI is the only device capable of restoring bilateral input to the auditory system and hence of possibly reinstating binaural hearing. Although several studies have independently shown that the CI is a safe and effective treatment for SSD and AHL, clinical outcome measures in those studies and across CI centers vary greatly. Only with a consistent use of defined and agreed-upon outcome measures across centers can high-level evidence be generated to assess the safety and efficacy of CIs and alternative treatments in recipients with SSD and AHL.
METHODS:
This paper presents a comparative study design and minimum outcome measures for the assessment of current treatment options in patients with SSD/AHL. The protocol was developed, discussed, and eventually agreed upon by expert panels that convened at the 2015 APSCI conference in Beijing, China, and at the CI 2016 conference in Toronto, Canada.
RESULTS:
A longitudinal study design comparing CROS-HA, BCD, and CI treatments is proposed. The recommended outcome measures include (1) speech in noise testing, using the same set of 3 spatial configurations to compare binaural benefits such as summation, squelch, and head shadow across devices; (2) localization testing, using stimuli that rove in both level and spectral content; (3) questionnaires to collect quality of life measures and the frequency of device use; and (4) questionnaires for assessing the impact of tinnitus before and after treatment, if applicable.
CONCLUSION:
A protocol for the assessment of treatment options and outcomes in recipients with SSD and AHL is presented. The proposed set of minimum outcome measures aims at harmonizing assessment methods across centers and thus at generating a growing body of high-level evidence for those treatment options.
The three major manufacturers of cochlear implant (CI) systems allow clinical audiologists to check the microphone characteristics of most CI speech processors. For this purpose, monitoring headphones can be connected to these speech processors so that the microphone(s), including part of the signal pre-processing, can be listened to. The CI manufacturers do not specify precisely with which stimuli, at which level, and according to which criterion this check should be performed. Based on this check, the audiologist then has to judge the functioning of the microphones and thus decide whether or not the speech processor in question is sent back to the manufacturer.
To objectify the CI speech processor microphone check, we have developed a test box with which all current CI speech processors of the three major manufacturers that support monitoring can be tested. The box was produced by 3D printing. The speech processor under test is mounted in the measurement box and exposed to defined test signals (sine tones of different frequencies) via a built-in loudspeaker. The microphone signal is brought out via the cable of the monitoring headphones and transformed by a shifting-and-scaling circuit into a voltage range suitable for AD conversion with a microcontroller (an ATmega1280 on an Arduino Mega). The same microcontroller outputs the sine tones via the loudspeaker through a custom-built DA converter. Signal acquisition and playback each run at a sampling rate of 38.5 kHz. The RMS value determined for each frequency over several periods of the test signal is compared to the RMS value measured for that frequency with an as-new reference processor. The measurement results are shown graphically on a display.
A first data collection is currently under way with CI speech processors that attracted attention subjectively in the clinic and are subsequently examined in the measurement box. In this way, realistic thresholds for critical deviations from the reference RMS values are to be determined. Subsequently, hit and false alarm rates of the subjective check will be determined.
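The RMS comparison against a reference processor can be sketched as follows; the 500 Hz test tone and the decibel-based deviation measure are illustrative choices, and only the 38.5 kHz sampling rate is taken from the abstract:

```python
import math

FS = 38500  # sampling rate of the test box (Hz), from the abstract

def rms(samples):
    """Effective (RMS) value over the recorded samples."""
    return math.sqrt(sum(x * x for x in samples) / len(samples))

def deviation_db(measured, reference):
    """Deviation of a processor's RMS from the reference processor's RMS."""
    return 20 * math.log10(measured / reference)

# Ten full periods of a 500 Hz test tone (77 samples per period at 38.5 kHz).
tone = [math.sin(2 * math.pi * 500 * n / FS) for n in range(770)]
measured = rms(tone)                       # ~0.707 for a unit-amplitude sine
attenuated = rms([0.5 * x for x in tone])  # a microphone that is 6 dB down
print(round(deviation_db(attenuated, measured), 2))  # -6.02
```

The critical tolerance in dB is exactly what the ongoing data collection is meant to determine, so no threshold is assumed here.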
The normal-hearing auditory system is able to use interaural time and phase differences for improved signal detection in noise. This phenomenon is often referred to as binaural unmasking and is effective both for simple signals such as sine tones and for speech signals in noise. Previous studies have shown that binaural unmasking can also be observed, to a limited extent, in bilateral CI users (Zirn et al., 2016).
Current results show that binaural unmasking is sensitive to the bilateral CI fitting. The effect can be modulated by tonotopic matching and by isolating an apical fine-structure channel. In this way, increases in binaural unmasking of up to 1.5 dB over the conventional CI fitting are possible. However, the influence of the CI fitting varies considerably between individuals.
Online-Marketing-Controlling
(2017)
Social Media-Controlling
(2017)
Strategic analysis techniques enable a structured, long-term assessment of internal company resources in alignment with the market. The basic techniques described here comprise the product life-cycle analysis model, various types of portfolio analysis, value-chain analysis, and SWOT analysis. These techniques support marketing controlling in preparing business-field and market analyses for management and in deriving strategic options for action.
The ability to detect a target signal masked by noise is improved in normal-hearing listeners when interaural phase differences (IPDs) between the ear signals exist either in the masker or in the signal. To improve binaural hearing in bilaterally implanted cochlear implant (BiCI) users, a coding strategy providing the best possible access to IPDs is highly desirable. Outcomes of a previous study (Zirn, Arndt et al., 2016) revealed that a subset of BiCI users showed improved IPD detection thresholds with the fine structure processing strategy FS4 compared to the constant-rate strategy HDCIS using narrowband stimuli. In contrast, little difference between the coding strategies was found for broadband stimuli with regard to binaural speech intelligibility level differences (BILD) as an estimate of binaural unmasking. Compared to normal-hearing listeners (7.5 ± 1.2 dB), BILDs were small in BiCI users (around 0.5 dB with both coding strategies).
In the present work, we investigated the influence of binaural fitting parameters on the BILD. In our cohort of BiCI users, many were implanted with electrode arrays differing in length between the left and right side. Because this length difference typically corresponded to the distance between two electrode contacts, the first modification of the bilateral fitting was a tonotopic adjustment by deactivating the most apical electrode contact on the side with the deeper-inserted array (tonotopic approach).
The second modification was the isolation of the remaining most apical electrode contacts by deactivating the basally adjacent electrode contact on each side (tonotopic sparse approach). With these modifications, the BILD improved by up to 1.5 dB.
Kein Mensch lernt digital
(2017)
The IT industry has had education on its agenda as a business field for many years. Business associations and IT representatives unanimously demand that digital technology and programming languages be taught as early as primary school so that pupils are prepared for the digital future. Yet the benefit of digital media in the classroom remains questionable.
In this book, Ralf Lankau exposes the economic interests of the IT industry and its lobbyists. He addresses both the scientific foundations (cybernetics, behaviorism) and the technical framework of networks and cloud computing before outlining concrete proposals for a reflective and responsible use of digital technology in the classroom. The author's thesis: we must return to our pedagogical task and make (digital) media again what they are in structured classroom teaching: didactic aids.
Anyone who examines the digitization efforts at schools will find that only few realize the scope of the intended transformation of educational institutions into automated learning factories by means of digital technology. Many of those involved believe, or want to believe, that it is merely about better technical equipment for educational institutions to support teachers, overlooking that cybernetics and behaviorism, two theories that treat the human being as determined, are experiencing a renaissance. Representatives of these disciplines believe that both individual human beings and entire societies or social communities can be programmed and controlled like a machine park. In the process, learning is redefined as an act of systematic self-disenfranchisement: the conditioning of learners toward testable competencies with the help of algorithms and software.
Kinder am Bildschirm
(2017)
When studies report on digital media and children, the focus is usually on making children "fit for the digital future." The consequences that too-early and unregulated use of screen media can have for children and adolescents are evident in pediatricians' practices. The spectrum of illnesses to be treated and the need for counseling on psychosocial problems have changed fundamentally in recent years, write those responsible for the BLIKK study.
Wie man die Vorlesung "Technische Mechanik 1 - Statik" für alle Beteiligten dynamisch gestaltet
(2017)
Teachers perceive many changes, especially among first-year students: prior knowledge, receptiveness, and the ability to concentrate are becoming increasingly heterogeneous. The lecture "Technische Mechanik 1" responded constructively by changing its sequence and structure. Problems and their solutions are at the center of the class. Besides the lecturer as the active agent, every student is integrated into the course over the semester and has to present individual solutions to the assigned problems. Through "learning from a model," students thereby further develop their methodological and subject-specific skills. To make the relevance of the covered topics clear to the students, special problems with a real-world context were developed. Surveys show that the students benefit from the diverse interactive learning offerings and also transfer the acquired competencies to other learning situations.
We present a two-dimensional (2D) planar chromatographic separation of estrogenic active compounds on RP-18 W (Merck, 1.14296) phase. A mixture of 8 substances was separated using a solvent mix consisting of hexane, ethyl acetate, acetone (55:15:10, v/v) in the first direction and of acetone and water (15:10, v/v) in the second direction. Separation was performed on an RP-18 W plate over a distance of 70 mm. This 2D-separation method can be used to quantify 17α-ethinylestradiol (EE2) in an effect-directed analysis, using the yeast strain Saccharomyces cerevisiae BJ3505. The test strain (according to McDonnell) contains the estrogen receptor. Its activation by estrogen active compounds is measured by inducing the reporter gene lacZ which encodes the enzyme β-galactosidase. This enzyme activity is determined on plate by using the fluorescent substrate MUG (4-methylumbelliferyl-β-d-galactopyranoside).
Finding clusters in high-dimensional data is a challenging research problem. Subspace clustering algorithms aim to find clusters in all possible subspaces of a dataset, where a subspace is a subset of the dimensions of the data. However, the exponential increase in the number of subspaces with the dimensionality of the data renders most of these algorithms inefficient as well as ineffective. Moreover, these algorithms have data dependencies ingrained in the clustering process, so parallelization becomes difficult and inefficient. SUBSCALE is a recent subspace clustering algorithm that is scalable with the number of dimensions and consists of independent processing steps that can be exploited through parallelism. In this paper, we aim to leverage, firstly, the computational power of widely available multi-core processors to improve the runtime performance of the SUBSCALE algorithm; the experimental evaluation has shown linear speedup. Secondly, we are developing an approach using graphics processing units (GPUs) for fine-grained data parallelism to accelerate the computation further. First tests of the GPU implementation show very promising results.
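The independent per-dimension processing that makes SUBSCALE amenable to parallelism can be sketched as follows; the simple 1-D density rule below is an illustrative stand-in for SUBSCALE's actual computation, and a real multi-core implementation would use a process pool or a GPU rather than threads:

```python
from concurrent.futures import ThreadPoolExecutor

def dense_points_1d(values, eps=1.0, min_pts=3):
    """Find the points lying in dense 1-D regions of one dimension:
    a point is dense if at least `min_pts` points (itself included)
    lie within `eps` of it. Such per-dimension results are the kind
    of independent building blocks that can be computed in parallel
    before being combined into subspace clusters."""
    out = set()
    for i, v in enumerate(values):
        if sum(1 for w in values if abs(w - v) <= eps) >= min_pts:
            out.add(i)
    return out

# Each column (dimension) is processed independently -> trivial parallelism.
data = [
    [1.0, 10.0],
    [1.2, 55.0],
    [0.9, 20.0],
    [9.0, 20.5],
    [9.1, 19.8],
]
columns = list(zip(*data))
with ThreadPoolExecutor() as pool:
    results = list(pool.map(dense_points_1d, columns))
print(results)  # per-dimension sets of dense point indices
```

Because no step reads another dimension's intermediate state, the work splits cleanly across cores, which is consistent with the linear speedup reported above.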