In bimodal cochlear implant (CI) / hearing aid (HA) users, a constant interaural time delay on the order of several milliseconds occurs due to differences in the signal processing of the devices. For MED-EL CI systems in combination with different HA types, we have quantified the respective device delay mismatch (Zirn et al. 2015). In the current study, we investigate the effect of the device delay mismatch on sound localization accuracy in simulated and actual bimodal listeners.
To compensate for the device delay mismatch in actual bimodal listeners, we delayed the CI stimulation according to the measured HA processing delay and two other values. With all delay values, highly significant improvements of the RMS error in the localization task were observed compared to the test without the delay. The results help to narrow down the optimal patient-specific delay value.
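The compensation described above amounts to shifting the CI stimulation stream in time. A minimal sketch, assuming a sampled signal representation; the sampling rate and the 6 ms delay are hypothetical example values, not the measured patient-specific ones:

```python
# Illustrative sketch (not the clinical fitting software): compensating a
# hearing-aid processing delay by delaying the CI stimulation stream.

def delay_samples(signal, delay_ms, fs_hz):
    """Delay a sampled signal by prepending zeros (integer-sample delay)."""
    n = round(delay_ms * fs_hz / 1000)
    return [0.0] * n + list(signal)

fs = 16000            # sampling rate in Hz (assumed)
ha_delay_ms = 6.0     # hypothetical HA processing delay
ci_stream = [0.1, 0.2, 0.3]
compensated = delay_samples(ci_stream, ha_delay_ms, fs)
# 6 ms at 16 kHz corresponds to 96 samples of leading zeros
```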
The ability to detect a target signal masked by noise is improved in normal-hearing listeners when interaural phase differences (IPDs) between the ear signals exist either in the masker or in the signal. To improve binaural hearing in bilaterally implanted cochlear implant (BiCI) users, a coding strategy providing the best possible access to IPDs is highly desirable. Outcomes of a previous study (Zirn, Arndt et al. 2016) revealed that a subset of BiCI users showed improved IPD detection thresholds with the fine structure processing strategy FS4 compared to the constant rate strategy HDCIS using narrowband stimuli. In contrast, only small differences between the coding strategies were found for broadband stimuli with regard to binaural speech intelligibility level differences (BILD) as an estimate of binaural unmasking. Compared to normal-hearing listeners (7.5 ± 1.2 dB), BILDs were small in BiCI users (around 0.5 dB with both coding strategies).
In the present work, we investigated the influence of binaural fitting parameters on the BILD. In our cohort of BiCI users, many were implanted with electrode arrays of different lengths on the left and right sides. Because this length difference typically corresponded to the distance between two electrode contacts, the first modification of the bilateral fitting was a tonotopic adjustment: deactivating the most apical electrode contact on the side with the deeper inserted array (tonotopic approach).
The second modification was the isolation of the remaining most apical electrode contacts by deactivating the basally adjacent electrode contact on each side (tonotopic sparse approach). With these modifications, the BILD improved by up to 1.5 dB.
The ability to detect a signal masked by noise is improved in normal-hearing (NH) listeners when interaural phase differences (IPD) between the ear signals exist either in the masker or in the signal. We determined the impact of different coding strategies, with and without fine-structure coding (FSC), on masking level differences in bilaterally implanted cochlear implant (BiCI) users. First, binaural intelligibility level differences (BILD) were determined in NH listeners and BiCI users using their clinical speech processors. NH subjects (n=8) showed a significant mean BILD of 7.5 dB. In contrast, BiCI users (n=9) without FSC as well as with FSC revealed a barely significant mean BILD (0.4 dB and 0.6 dB, respectively). Second, IPD thresholds were measured in BiCI users using either their speech processors with FS4 or direct stimulation with FSC. With the latter approach, synchronized stimulation providing an interaural accuracy of stimulation timing of 1.67 µs was realized on pitch-matched electrode pairs. Individual IPD thresholds were lower with direct stimulation than with the speech processors in most subjects. These outcomes indicate that some BiCI users can benefit from increased temporal precision of interaural FSC and adjusted interaural frequency-place mapping, presumably resulting in improved BILD.
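For context, the interaural phase difference produced by a given interaural time delay follows IPD = 360° · f · ITD. A small worked example, assuming a 250 Hz test frequency (an illustrative choice, not a parameter from the study), showing how fine the 1.67 µs timing accuracy is in phase terms:

```python
import math

def ipd_degrees(freq_hz, itd_s):
    """Interaural phase difference (degrees) produced by a time delay."""
    return (360.0 * freq_hz * itd_s) % 360.0

# The 1.67 µs interaural timing accuracy of direct stimulation corresponds
# to a phase step of only about 0.15 degrees at 250 Hz:
step = ipd_degrees(250.0, 1.67e-6)
```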
BiCI users’ sensitivity to interaural phase differences for single- and multi-channel stimulation
(2016)
The main advantage of mobile context-aware applications is to provide effective and tailored services by considering the environmental context, such as location, time, and nearby objects, and by adapting their functionality to changing situations in the context information without explicit user interaction. The idea behind Location-Based Services (LBS) and Object-Based Services (OBS) is to offer fully customizable services tailored to user needs according to the location or the objects in a mobile user's vicinity. However, developing mobile context-aware software applications is considered one of the most challenging application domains because of the built-in sensors of mobile devices. Visual Programming Languages (VPL) and hybrid visual programming languages are considered innovative approaches to addressing the inherent complexity of developing such programs. The key contribution of our new development approach for location- and object-based mobile applications is a use-case-driven development approach based on use case templates and visual code templates that enables even programming beginners to create context-aware mobile applications. An example of the use of the development approach is presented, and open research challenges and perspectives for the further development of our approach are formulated.
Sensors and actuators enable the creation of context-aware applications that can discover and take advantage of contextual information, such as user location and nearby people and objects. In this work, we use a general context definition that can be applied to various devices, e.g., robots and mobile devices. Developing context-based software applications is considered one of the most challenging application domains because of the sensors and actuators that are part of a device. We introduce a new development approach for context-based applications using use-case descriptions and Visual Programming Languages (VPL). The introduction of web-based VPLs, such as Scratch and Snap, has reinvigorated the usefulness of VPLs. We provide an in-depth discussion of our new VPL-based method, a step-by-step development process that enables the development of context-based applications. Two case studies illustrate how to apply our approach to different problem domains: context-based mobile apps and context-based humanoid robot applications.
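The core idea of a context-based service can be sketched as a rule that maps a context description to a service. A minimal illustration, assuming a simple dictionary-based context model; the keys and service names below are hypothetical, not from the described approach:

```python
# Toy context-to-service rule set; in the described approach such rules
# would be assembled from use-case templates in a visual language.

def select_service(context):
    """Pick a service based on the current context (illustrative rules)."""
    if context.get("location") == "lecture_hall":
        return "silent_mode"
    if context.get("time_of_day") == "evening" and context.get("nearby") == "bus_stop":
        return "show_timetable"
    return "default"

chosen = select_service({"location": "lecture_hall"})
```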
The aim of the smart grid is to achieve a more efficient, distributed, and secure supply of energy than the traditional power grid by using a bidirectional information flow between the grid agents (e.g., generator nodes, customers). One of the key optimization problems in the smart grid is to produce power among generator nodes at minimum cost while meeting the customer demand, known as the Economic Dispatch Problem (EDP). In recent years, many distributed approaches to solving the EDP have been proposed. However, protecting the privacy-sensitive data of individual generator nodes has been largely overlooked in existing solutions. In this work, we show an attack against an existing auction-based EDP protocol considering a non-colluding semi-honest adversary. We also briefly introduce our approach to a practical privacy-preserving EDP solution as work in progress.
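For readers unfamiliar with the EDP, the textbook unconstrained case is easy to state: with quadratic generator costs C_i(p) = a_i·p² + b_i·p and no generator limits, the optimum equalizes the incremental costs. A minimal sketch under those assumptions (coefficients are made-up examples; the paper's protocol is distributed and auction-based, unlike this centralized closed form):

```python
# Unconstrained Economic Dispatch for quadratic costs, solved via the
# equal-incremental-cost condition: 2*a_i*p_i + b_i = lambda for all i.

def economic_dispatch(gens, demand):
    """Return per-generator outputs minimizing total cost for `demand`."""
    inv = sum(1.0 / (2 * a) for a, b in gens)
    lam = (demand + sum(b / (2 * a) for a, b in gens)) / inv
    return [(lam - b) / (2 * a) for a, b in gens]

gens = [(0.01, 2.0), (0.02, 1.5)]   # (a_i, b_i) for two generators
p = economic_dispatch(gens, 300.0)  # outputs sum to the demand
```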
Digital, virtual environments and the metaverse are rapidly taking shape and will generate disruptive changes in the areas of ethics, privacy, safety, and how relationships between human beings are developed. To uncover some of the implications that will impact those areas, this study investigates the perceptions of 101 younger people from generations Y and Z. We present a first exploratory analysis of the findings, focusing on knowledge and self-perception. Results show that these young generations seriously doubt their knowledge of the metaverse and virtual worlds, regarding both the definition and the usage. It is interesting to see only a medium confidence level, considering that the participants are young and from an academic environment, which should increase their interest in and affinity towards virtual worlds. Males from both generations perceive themselves as significantly more knowledgeable than females. Regarding a fitting definition, almost 40% agreed on the metaverse as a "universal and immersive virtual world that is made accessible using virtual reality and augmented reality technologies". Regarding the topic in general, several participants considered themselves sceptics (almost 40%) or "just" users (38%). Interestingly, generation Y participants were more likely than the younger generation Z participants to identify themselves as early adopters or innovators. As a result, the considerable amount of "mixed feelings" regarding digital, virtual environments and the metaverse shows that in-depth studies on the perception of the metaverse as well as its ethical and integrity implications are required to create more accessible, inclusive, and safe digital, virtual environments.
A system for the on-line/in-line measurement of soot particle sizes and concentrations in the undiluted exhaust gas of diesel engines was developed and successfully tested. The unit uses the individual attenuations of three different laser wavelengths and is combined with a multipass optical cell (White cell principle) with adjustable path lengths from 2.5 to 15 meters.
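The attenuation measurement rests on the Beer-Lambert law: transmitted intensity falls exponentially with extinction coefficient and path length, which is why the adjustable 2.5–15 m path matters for weakly absorbing exhaust. A small illustration with made-up values (not the instrument's calibration data):

```python
import math

def transmission(ext_per_m, path_m):
    """Fraction of light transmitted over `path_m` (Beer-Lambert law)."""
    return math.exp(-ext_per_m * path_m)

# A longer optical path increases the measurable attenuation for the
# same (hypothetical) extinction coefficient of 0.05 1/m:
t_short = transmission(0.05, 2.5)    # ~0.88 transmitted
t_long = transmission(0.05, 15.0)    # ~0.47 transmitted
```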
Time Resolved Measurements of Soot Concentrations and Mean Particle Sizes during EUDC and ECE Cycles
(2002)
The overview of public key infrastructure based security approaches for vehicular communications
(2015)
Modern transport infrastructure is becoming a full member of the globally connected network. Leading vehicle manufacturers have already triggered a development process whose output will open a new horizon of possibilities for consumers and developers by providing a new communication entity, the car, thus enabling Car2X communications. Although some available systems already provide certain possibilities for vehicles to communicate, most of them are considered insufficiently secured. During the last 15 years, a number of large research projects funded by the European Union and the US government were started and concluded, after which a set of standards was published prescribing a common architecture for Car2X and vehicle onboard communications. This work concentrates on combining inner and outer vehicular communications with the use of a Public Key Infrastructure (PKI).
Wireless communication systems are becoming more and more a part of our daily lives. Especially with the Internet of Things (IoT), overall connectivity is increasing rapidly, since everyday objects become part of the global network. For this purpose, several new wireless protocols have arisen, of which 6LoWPAN (IPv6 over Low power Wireless Personal Area Networks) can be seen as one of the most important in this sector. Originally designed on top of the IEEE 802.15.4 standard, it has been subject to various adaptations that allow 6LoWPAN to be used over different technologies, e.g., DECT Ultra Low Energy (ULE). Although this high connectivity offers a lot of new possibilities, several requirements and pitfalls come along with such new systems. With an increasing number of connected devices, interoperability between different providers is one of the biggest challenges, which makes it necessary to verify the functionality and stability of the devices and the network. Testing therefore becomes one of the key components that decides the success or failure of such a system. Although several protocol implementations are commonly available, e.g., for IoT-based systems, there is still a lack of corresponding tools and environments for functional and conformance testing. This article describes the architecture and functioning of the proposed test framework, based on the Testing and Test Control Notation Version 3 (TTCN-3), for 6LoWPAN over ULE networks.
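The framework itself is written in TTCN-3; as a language-neutral illustration of the conformance-testing pattern (send a stimulus, match the response, assign a verdict), here is a toy analogue in Python. The message names and the device stub are hypothetical:

```python
# TTCN-3-style test case skeleton: stimulus, timed receive, verdict.
# 'inconc' models a timer expiry, as in TTCN-3's inconclusive verdict.

def run_testcase(send, receive, timeout_s=2.0):
    """Return a verdict string: 'pass', 'fail', or 'inconc'."""
    send(b"ECHO_REQUEST")
    reply = receive(timeout_s)       # None models no answer in time
    if reply is None:
        return "inconc"
    return "pass" if reply == b"ECHO_REPLY" else "fail"

# Stubbed device under test that always answers correctly:
verdict = run_testcase(lambda msg: None, lambda t: b"ECHO_REPLY")
```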
The CAN bus is still an important fieldbus in various domains, e.g., for in-car communication or automation applications. To counter security threats and concerns in such scenarios, we design, implement, and evaluate an end-to-end security concept based on the Transport Layer Security (TLS) protocol. It is used to establish authenticated, integrity-checked, and confidential communication channels between field devices connected via CAN. Our performance measurements show that it is possible to use TLS at least for non-time-critical applications, as well as for generic embedded networks.
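To make the integrity-checking idea concrete: a record protocol like TLS protects each message with a keyed tag that the receiver verifies. The sketch below shows only that idea with a truncated HMAC sized to fit alongside data in a CAN frame's 8-byte payload; the key, truncation length, and frame layout are illustrative assumptions, not the paper's implementation (where keys would come from the TLS handshake):

```python
import hashlib
import hmac

KEY = b"demo-session-key"   # illustrative; TLS derives this in the handshake

def tag(payload: bytes, length: int = 4) -> bytes:
    """Truncated HMAC-SHA256 tag over a CAN payload."""
    return hmac.new(KEY, payload, hashlib.sha256).digest()[:length]

def verify(payload: bytes, t: bytes) -> bool:
    """Constant-time check that the tag matches the payload."""
    return hmac.compare_digest(tag(payload, len(t)), t)

frame_data = b"\x01\x02\x03\x04"     # 4 data bytes + 4 tag bytes = 8-byte frame
frame_tag = tag(frame_data)
```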
Practical exercises are a crucial part of many curricula. Even simple exercises can improve the understanding of the underlying subject. Most experimental setups require special hardware. To carry out, e.g., a lens experiment, the students need access to an optical bench, various lenses, light sources, apertures, and a screen. In our previous publication we demonstrated the use of augmented reality visualization techniques to let students prepare with a simulated experimental setup. Within the context of our intended blended learning concept, we want to utilize augmented or virtual reality techniques for stationary laboratory exercises. Unlike applications running on mobile devices, stationary setups can be extended more easily with additional interfaces and thus allow for more complex interactions and simulations in virtual reality (VR) and augmented reality (AR). The most significant difference is the possibility of allowing interactions beyond touching a screen. The LEAP Motion controller is a small, inexpensive device that allows the tracking of the user's hands and fingers in three dimensions. It is conceivable to let the user interact with the simulation's virtual elements through the user's very hand position, movement, and gestures. In this paper we evaluate possible applications of the LEAP Motion controller for simulated experiments in augmented and virtual reality. We pay particular attention to the device's strengths and weaknesses and want to point out useful and less useful application scenarios. © (2016) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
In many scientific study programs, lens experiments are part of the curriculum. The experiments are meant to give the students a basic understanding of the laws of optics and their applications. Most of the experiments need special hardware such as an optical bench, light sources, apertures, and different lens types. It is therefore not possible for the students to conduct any of the experiments outside of the university's laboratory. Simple optical software simulators enabling students to virtually perform lens experiments already exist, but they are mostly desktop- or web-browser-based.
Augmented Reality (AR) is a special case of mediated and mixed reality concepts, where computers are used to add to, subtract from, or modify one's perception of reality. As a result of the success and widespread availability of handheld mobile devices such as tablet computers and smartphones, mobile augmented reality applications are easy to use. Augmented reality can readily be used to visualize a simulated optical bench. The students can interactively modify properties such as lens type, lens curvature, lens diameter, lens refractive index, and the positions of the instruments in space. Light rays can be visualized and promote an additional understanding of the laws of optics. An AR application like this is ideally suited to preparing the actual laboratory sessions and/or recapping the teaching content.
The authors will present their experience with handheld augmented reality applications and their possibilities for light and optics experiments without the need for specialized optical hardware.
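The physics behind such a simulated optical bench is the thin-lens equation, 1/f = 1/o + 1/i. A worked example (focal length and object distance are arbitrary illustrative values):

```python
# Thin-lens equation: given focal length f and object distance o,
# solve for the image distance i and the lateral magnification.

def image_distance(f_mm, object_mm):
    """Image distance from 1/f = 1/o + 1/i (negative means virtual image)."""
    return 1.0 / (1.0 / f_mm - 1.0 / object_mm)

def magnification(object_mm, image_mm):
    """Lateral magnification; negative means the image is inverted."""
    return -image_mm / object_mm

i = image_distance(50.0, 150.0)   # 75 mm: a real image
m = magnification(150.0, i)       # -0.5: inverted, half-size
```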
Activities for rehabilitation and prevention are often lengthy and associated with pain and frustration. Their playful enrichment (hereafter: gamification) can counteract this, resulting in so-called "exergames". However, in contrast to games designed solely for entertainment, the increased motivation and immersion in gamified training can lead to a reduced perception of pain and thus to health deterioration. It is therefore necessary to monitor activities continuously. However, only an AI-based system able to generate autonomous interventions could free up therapists' costly time and allow better training at home. Automated adjustment of the movement training's difficulty as well as individualized goal setting and control are essential to achieve such autonomy. This article's contribution is two-fold: (1) we portray the potentials of gamification in the health area, and (2) we present a framework for smart rehabilitation and prevention training allowing autonomous, dynamic, and gamified interactions.
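The automated difficulty adjustment mentioned above can be sketched as a simple feedback rule. This is a minimal illustration under assumed conventions (a difficulty level and performance score both in [0, 1], made-up thresholds and step size), not the framework's actual controller:

```python
# Toy adaptive-difficulty rule: raise difficulty on success, lower it on
# struggle, and back off strongly when pain is reported (safety first).

def adjust_difficulty(level, score, pain_reported=False, step=0.1):
    """Return the next difficulty level in [0, 1]."""
    if pain_reported:
        return max(0.0, level - 2 * step)
    if score > 0.8:
        return min(1.0, level + step)
    if score < 0.4:
        return max(0.0, level - step)
    return level

nxt = adjust_difficulty(0.5, 0.9)   # good performance -> harder
```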
The purpose of this study was to describe the effects of running speed and slope on metatarsophalangeal (MTP) joint kinematics. 22 male and female runners underwent 3D motion analysis on an instrumented treadmill at three different speeds (2.5 m/s, 3.0 m/s, 3.5 m/s). At each speed, participants ran at seven slope conditions (downhill: -15%, -10%, -5%, level, and uphill: +5%, +10%, +15%). We found a significant main effect (p < 0.001) of running speed and slope on peak MTP dorsiflexion and a running speed by slope interaction effect (p < 0.001) for peak MTP dorsiflexion velocity. These findings highlight the need to consider running intensity and environmental factors like running surface inclination when considering MTP joint mechanics and technological aids to support runners.
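The two outcome variables above, peak dorsiflexion angle and peak dorsiflexion velocity, can be extracted from a sampled joint-angle trace roughly as follows. The trace and sampling rate below are synthetic illustration values, not the study's motion-capture pipeline:

```python
# Extract peak angle and peak finite-difference angular velocity from a
# sampled MTP joint-angle trace (degrees), given the sampling rate.

def peak_angle_and_velocity(angles_deg, fs_hz):
    """Return (peak angle in deg, peak dorsiflexion velocity in deg/s)."""
    vel = [(b - a) * fs_hz for a, b in zip(angles_deg, angles_deg[1:])]
    return max(angles_deg), max(vel)

trace = [0.0, 10.0, 25.0, 45.0, 30.0, 5.0]   # synthetic stance-phase angles
peak, peak_vel = peak_angle_and_velocity(trace, 200.0)   # 200 Hz assumed
```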
Established robot manufacturers have developed methods to determine and optimize the accuracy of their robots. These methods vary from one manufacturer to another. Due to the lack of published data, a comparison of robot performance is difficult. The aim of this article is to find methods to evaluate important characteristics of a robot with an accurate and cost-effective setup. A laser triangulation sensor and geometrically referenced spheres were used as a basis for comparing robot performance.
Objective: Dickkopf 3 (DKK3) has been identified as a urinary biomarker. Values above 4000 pg/mg creatinine (Cr) have been linked to a higher risk of short-term decline of kidney function (J Am Soc Nephrol 29: 2722–2733). However, as of today, there is little experience with DKK3 as a risk marker in everyday clinical practice. We used algorithm-based data analysis to evaluate the potential of DKK3 as a risk marker in a cohort from a large single center in Germany.
Method: DKK3 was measured in all CKD patients in our center from October 1st, 2018 until December 31st, 2019, together with the calculated GFR (eGFR) and the urinary albumin/creatinine ratio (UACR). Kidney transplant patients were excluded. Until the end of follow-up on December 31st, 2021, repeated measurements were performed for all parameters. Data analysis was performed using MD-Explorer (BioArtProducts, Rostock, Germany) and Python with multiple libraries. Linear regression models were applied for DKK3, eGFR, and UACR. The models were compared with a two-sided Kolmogorov-Smirnov test.
Results: 1206 DKK3 measurements were performed in 1103 patients (621 male, age 70 yrs, eGFR 29.41 ml/min/1.73qm, UACR 800 mg/g). 134 patients died during follow-up. Mean DKK3 was 2905 pg/mg Cr (max. 20000, 75th percentile 3800). 121 patients had DKK3 > 4000. At the end of follow-up, 7 % of patients with DKK3 < 4000 (initial eGFR 17.6) versus 39.6 % of patients with DKK3 > 4000 (initial eGFR 15.7) underwent dialysis. Compared to eGFR and UACR at baseline, DKK3 > 4000 performed best in predicting eGFR loss over the next 12 months.
Conclusion: In this cohort of CKD patients, DKK3 > 4000 at baseline predicted the eGFR slope better than eGFR or UACR at baseline. DKK3 > 4000 reflected a higher risk of progression towards ESRD in patients with similar baseline eGFR levels.
The visualization of heart rhythm disturbances and atrial fibrillation therapy allows the optimization of new cardiac catheter ablations. With the simulation software CST (Computer Simulation Technology, Darmstadt), electromagnetic and thermal simulations can be carried out to analyze and optimize different heart rhythm disturbances and cardiac catheters for pulmonary vein isolation. Another form of visualization is provided by haptic, three-dimensional print models. These models can be produced using an additive manufacturing method, such as a 3D printer. The aim of the study was to produce a 3D print of the Offenburg heart rhythm model with a representation of an atrial fibrillation ablation procedure, to improve the visualization of the simulation of cardiac catheter ablation.
The basis of the 3D printing was the Offenburg heart rhythm model and the associated simulation of cryoablation of the pulmonary vein. The thermal simulation shows the pulmonary vein isolation of the left inferior pulmonary vein with the cryoballoon catheter Arctic Front Advance™ from Medtronic. After running the simulation, the thermal propagation during the procedure was shown in the form of different colors. The three-dimensional print models were constructed on the basis of the described simulation in a CAD program. Four different 3D printers are available for this purpose in a rapid prototyping laboratory at the University of Applied Sciences Offenburg. Two different printing processes were used: 1. a binder jetting printer with polymer gypsum and 2. a multi-material printer with photopolymer. A final print model with an additional representation of the esophagus and an internal esophageal catheter was also prepared for printing.
With the help of the thermal simulation results and the subsequent evaluation, it was possible to draw conclusions about the propagation of the cold emanating from the catheter into the myocardium and the surrounding tissue. Measurements showed that only 3 mm from the balloon surface into the myocardium, the temperature drops to 25 °C. The simulation model was printed using two 3D printing methods. Both methods, as well as the different printing materials, offer different advantages and disadvantages. While the first model, made of polymer gypsum, can be produced quickly and cheaply, the second model, made of photopolymer, takes five times longer and was twice as expensive. On the other hand, the second model offers significantly better properties and was more durable overall. All relevant parts, especially the balloon catheter and the conduction, are realistically represented. Only the thermal propagation in the form of different colors is not shown on this model.
Three-dimensional heart rhythm models as well as virtual simulations allow a very good visualization of complex cardiac rhythm therapy and atrial fibrillation treatment methods. The printed models can be used for optimization and demonstration of cryoballoon catheter ablation in patients with atrial fibrillation.
Strings
(2020)
This article presents the currently ongoing development of an audiovisual performance work with the title Strings. This work provides an improvisation setting for a violinist, two laptop performers, and two generative systems. At the core of Strings lies an approach that establishes a strong correlation among all participants by means of a shared physical principle. The physical principle is that of a vibrating string. The article discusses how this principle is used in both natural and simulated forms as main interaction layer between all performers and as natural or generative principle for creating audio and video.
Security in IT systems, particularly in embedded devices like Cyber Physical Systems (CPSs), has become an important matter of concern, as it is the prerequisite for ensuring privacy and safety. Among a multitude of existing security measures, the Transport Layer Security (TLS) protocol family offers mature and standardized means for establishing secure communication channels over insecure transport media. In the context of classical IT infrastructure, its security with regard to protocol and implementation attacks has been subject to extensive research. As TLS protocols find their way into embedded environments, we consider the security and robustness of implementations of these protocols specifically in light of the peculiarities of embedded systems. We present an approach for systematically checking the security and robustness of such implementations using fuzzing techniques and differential testing. In spite of its origin in testing TLS implementations, we expect our approach to likewise be applicable to implementations of other cryptographic protocols with moderate effort.
Short-term load forecasting (STLF) has been playing a key role in the electricity sector for several decades, due to the need for aligning energy generation with demand and the financial risk connected with forecasting errors. Following the top-down approach, forecasts are calculated for aggregated load profiles, meaning the sum of singular loads from consumers belonging to a balancing group. Due to emerging flexible loads, there is an increasing relevance for STLF of individual factories. These load profiles are typically more stochastic than aggregated ones, which imposes new requirements on forecasting methods and tools following a bottom-up approach. The increasing digitalization in industry with enhanced data availability as well as smart metering are enablers for improved load forecasts. There is a need for STLF tools processing live data with a high temporal resolution in the minute range. Furthermore, behind-the-meter (BTM) data from various sources like submetering and production planning data should be integrated into the models. In this case, STLF is becoming a big data problem, so that machine learning (ML) methods are required. The research project “GaIN” investigates the improvement of the STLF quality of an energy utility using BTM data and innovative ML models. This review paper describes the project scope, proposes a detailed benchmark definition and evaluates the readiness of existing STLF methods to fulfil the described requirements.
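A common reference point when benchmarking STLF methods is a naive seasonal persistence forecast. The sketch below is illustrative only and is not part of the GaIN project; the load values, horizon, and period are hypothetical:

```python
def persistence_forecast(history, horizon, period):
    """Naive seasonal persistence: repeat the value observed one
    period (e.g. one week of intervals) earlier."""
    return [history[len(history) - period + h] for h in range(horizon)]

def mape(actual, forecast):
    """Mean absolute percentage error, a common STLF benchmark metric."""
    return 100.0 * sum(abs(a - f) / abs(a)
                       for a, f in zip(actual, forecast)) / len(actual)

# Hypothetical factory load (kW) for two "weeks" of 4 intervals each
history = [100, 120, 130, 110, 102, 118, 135, 108]
forecast = persistence_forecast(history, horizon=4, period=4)
actual = [105, 121, 128, 111]
print(forecast)  # [102, 118, 135, 108]
print(round(mape(actual, forecast), 2))
```

Any ML model proposed for company-level STLF would need to beat such a baseline on the benchmark data to justify its complexity.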
The review highlights that recent STLF investigations focus on ML methods; hybrid models in particular are gaining importance. ML can outperform classical methods in terms of automation degree and forecasting accuracy. Nevertheless, the potential for improving forecasting accuracy through ML models depends on the underlying data and the types of input variables. The methods described in the analyzed publications only partially fulfil the tool requirements for STLF at company level. There is still a need to develop suitable ML methods that integrate the expanded data base in order to improve load forecasts at company level.
Colored glass products manufactured with various printing technologies are becoming more important in industry. The aim is to achieve individual solutions within very short delivery times. Conventional thermal treatment, firing glasses in an oven for tempered color printing, suffers from high time consumption, energy consumption and manufacturing cost, which calls for the development of an alternative process.
This paper proposes a laser process to overcome the issues of conventional treatment and presents the latest results on tempering colored glass. Samples have been analyzed with a scanning electron microscope (SEM). Two different laser systems have been applied, and the glass has been printed with black paste.
One of the challenges for autonomous driving in general is to detect objects in the car's camera images. In the Audi Autonomous Driving Cup (AADC), among those objects are other cars, adult and child pedestrians and emergency vehicle lighting. We show that with recent deep learning networks we are able to detect these objects reliably on the limited hardware of the model cars. Also, the same deep network is used to detect road features like mid lines, stop lines and even complete crossings. Best results are achieved using Faster R-CNN with Inception v2, showing an overall accuracy of 0.84 at 7 Hz.
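Detection accuracies such as the one reported here are typically scored by matching predicted boxes against ground truth via intersection-over-union (IoU). The abstract does not describe its exact scoring procedure, so the following is only a generic sketch of the IoU computation with hypothetical pixel coordinates:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# A predicted car box vs. a ground-truth box (hypothetical coordinates):
# intersection 20x20 = 400, union 1600 + 1600 - 400 = 2800
print(iou((10, 10, 50, 50), (30, 30, 70, 70)))
```

A prediction is usually counted as a true positive when its IoU with a ground-truth box exceeds a threshold such as 0.5.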
This paper presents the design and development of a blended learning concept for an engineering course in the field of color representation and display technologies. A suitable learning environment is crucial for the success of the teaching scenario. A mixture of theoretical lectures and hands-on activities with practical applications and experiments, combined with the advantages of modern digital media, is the main topic of the paper. Blended learning describes the didactic alternation between attendance periods and online periods. The e-learning environment for the online period is designed for easy access and interaction. Modern digital media extend the established teaching scenarios and enable the presentation of videos, animations and augmented reality (AR). Visualizations are effective tools to impart learning contents with lasting effect. The preparation and evaluation of the theoretical lectures and the hands-on activities are stimulated, which positively affects the attendance periods. The tasks and experiments require the students to work independently and to develop individual solution strategies. This engages and motivates the students and deepens their knowledge. The authors will present their experience with the implemented blended learning scenario in this field of optics and photonics. All aspects of the learning environment will be introduced.
Monitors are at the center of media productions and hold an important function as the main visual interface. Tablets and smartphones are becoming increasingly important work tools in the media industry. As an extension to our lecture contents, an intensive discussion of different display technologies and their applications now takes place. The established LCD (Liquid Crystal Display) technology and the promising OLED (Organic Light Emitting Diode) technology are in focus.
The classic LCD is currently the most important display technology. The paper will present how students can develop a feel for display technologies beyond the theoretical scientific basics. The workshop focuses increasingly on the technical aspects of display technology and has the goal of deepening the students' understanding of its functionality by having them build simple liquid crystal displays themselves.
The authors will present their experience in the field of display technologies. A mixture of theoretical and practical lectures has the goal of a deeper understanding in the field of digital color representation and display technologies. The design and development of a suitable learning environment with the required infrastructure is crucial. The main focus of this paper is on the hands-on optics workshop “Liquid Crystal Display in the do-it-yourself”.
VR-based implementation of interactive laboratory experiments in optics and photonics education
(2022)
Within the framework of a developed blended learning concept, a lot of experience has already been gained with a mixture of theoretical lectures and hands-on activities, combined with the advantages of modern digital media. Here, visualizations using videos, animations and augmented reality have proven to be effective tools to convey learning content in a sustainable way. In the next step, ideas and concepts were developed to implement hands-on laboratory experiments in a virtual environment. The main focus is on the realization of virtual experiments and environments that give the students a deep insight into selected subfields of optics and photonics.
The authors explain a developed concept for research-oriented education in optics and photonics. The paper presents which goals are to be achieved, which strategies have been developed and how these can be implemented in a blended learning scenario. The goal of our education is the best possible qualification of the students on the basis of a strong scientific and research-oriented education, which also includes the acquisition of important interdisciplinary competences. All phases of a research process are to be mapped in the learning process and offer students an insight into current research topics in optics and photonics.
Redesigning a curriculum for teaching media technology is a major challenge. Up-to-date teaching and learning concepts are necessary that meet the constant technological progress and prepare students specifically for their professional life. Teaching and studying should be characterized by a student-oriented teaching and learning culture. In order to achieve this goal, consistent evaluation is essential. The aim of the evaluation concept presented here is to generate structured information regarding the quality of content-related, didactic and organizational aspects of teaching. The exchange of opinions between students and lecturers should be encouraged in order to continuously improve the teaching and learning processes.
Cardiac resynchronization therapy (CRT) with biventricular pacing is an established therapy for heart failure (HF) patients (P) with ventricular desynchronization and reduced left ventricular (LV) ejection fraction. The aim of this study was to evaluate electrical right atrial (RA), left atrial (LA), right ventricular (RV) and LV conduction delay with novel telemetric signal averaging electrocardiography (SAECG) in implantable cardioverter defibrillator (ICD) P to better select P for CRT and to improve hemodynamics in cardiac pacing.
Methods: ICD-P (n=8, age 70.8 ± 9.0 years; 2 females, 6 males) with VVI-ICD (n=4), DDD-ICD (n=3) and CRT-ICD (n=1) (Medtronic, Inc., Minneapolis, MN, USA) were analysed with telemetric ECG recording using the Medtronic programmer 2090, ECG cable 2090AB, a PCSU1000 oscilloscope with Pc-Lab2000 software (Velleman®) and novel National Instruments LabVIEW SAECG software.
Results: Electrical RA conduction delay (RACD) was measured between onset and offset of RA deflection in the RAECG. Interatrial conduction delay (IACD) was measured between onset of RA deflection and onset of far-field LA deflection in the RAECG. Interventricular conduction delay (IVCD) was measured between onset of RV deflection in the RVECG and onset of LV deflection in the LVECG. Telemetric SAECG recording was possible in all ICD-P with a mean of 11.7 ± 4.4 SAECG heart beats, 97.6 ± 33.7 ms QRS duration, 81.5 ± 44.6 ms RACD, 62.8 ± 28.4 ms RV conduction delay, 143.7 ± 71.4 ms right cardiac AV delay, 41.5 ms LA conduction delay, 101.6 ms LV conduction delay, 176.8 ms left cardiac AV delay, 53.6 ms IACD and 93 ms IVCD.
Conclusions: Determination of RA, LA, RV and LV conduction delay, IACD, IVCD, right and left cardiac AV delay by telemetric SAECG recording using LabView SAECG technique may be useful parameters of atrial and ventricular desynchronization to improve P selection for CRT and hemodynamics in cardiac pacing.
Spectral analysis of signal averaging electrocardiography in atrial and ventricular tachyarrhythmias
(2017)
Background: Targeting complex fractionated atrial electrograms detected by automated algorithms during ablation of persistent atrial fibrillation has produced conflicting outcomes in previous electrophysiological studies. The aim of the investigation was to evaluate atrial and ventricular high frequency fractionated electrical signals with signal averaging technique.
Methods: Signal averaging electrocardiography (ECG) is a high-resolution ECG technique that eliminates interference noise in the recorded ECG. The algorithm uses an automatic ECG trigger function for signal averaged transthoracic, transesophageal and intracardiac ECG signals with novel LabVIEW software (National Instruments, Austin, Texas, USA). For spectral analysis we used fast Fourier transformation in combination with spectro-temporal mapping and wavelet transformation to obtain detailed information about the frequency and intensity of high frequency atrial and ventricular signals.
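The frequency-domain step of such an analysis can be illustrated with a minimal sketch. The study used fast Fourier transformation in LabVIEW; the naive discrete Fourier transform and the synthetic test signal below are stand-ins chosen only to show how a dominant frequency component is identified:

```python
import cmath
import math

def dft_magnitudes(signal):
    """Discrete Fourier transform magnitudes (naive O(n^2) version),
    illustrating the frequency-domain view used in spectral analysis."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]

# Synthetic signal: 3 full cycles of a sine over 24 samples
signal = [math.sin(2 * math.pi * 3 * t / 24) for t in range(24)]
mags = dft_magnitudes(signal)
# Search the positive-frequency bins for the strongest component
peak_bin = max(range(1, 12), key=lambda k: mags[k])
print(peak_bin)  # 3: the dominant bin matches the 3-cycle tone
```

Real implementations use an FFT (O(n log n)) and, as in the study, combine it with spectro-temporal mapping to localize high frequency components in time.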
Results: Spectro-temporal mapping and wavelet transformation of the signal averaged ECG allowed the evaluation of high frequency fractionated atrial signals in patients with atrial fibrillation and high frequency ventricular signals in patients with ventricular tachycardia. The analysis in the time domain evaluated fractionated atrial signals at the end of the signal averaged P-wave and fractionated ventricular signals at the end of the QRS complex. The analysis in the frequency domain evaluated high frequency fractionated atrial signals during the P-wave and high frequency fractionated ventricular signals during the QRS complex. The combination of analysis in the time and frequency domain allowed the evaluation of fractionated signals during atrial and ventricular conduction.
Conclusions: Spectral analysis of signal averaging electrocardiography with novel LabVIEW software can be utilized to evaluate atrial and ventricular conduction delays in patients with atrial fibrillation and ventricular tachycardia. Complex fractionated atrial electrograms may be useful parameters to evaluate electrical cardiac arrhythmogenic signals in atrial fibrillation ablation.
In public transportation, the motor pool often consists of various vehicles bought over many years; sometimes they even differ within one batch bought at the same time. This poses a considerable challenge in the storage and allocation of spare parts, especially in the event of damage to a vehicle. Correctly assigning these parts before the vehicle reaches the workshop could significantly reduce both the downtime and, therefore, the actual costs for companies. In order to achieve this, the current software uses a simple probability calculation. To improve performance, the data of specific companies was analysed, preprocessed and used with several modelling techniques to classify and thereby predict the spare parts to be used in the event of a faulty vehicle. We summarize our experience running through the steps of the Cross Industry Standard Process for Data Mining and compare the performance to the previously used probability calculation. Gradient Boosting Trees turned out to be the best modelling technique for this case.
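The "simple probability calculation" that serves as the baseline here can be sketched as a frequency count over past repairs. The abstract does not publish the actual software's logic, so the fault codes, part names, and ranking scheme below are hypothetical:

```python
from collections import Counter

def train_baseline(records):
    """For each fault code, count how often each spare part was used
    in past repairs (a simple empirical-probability baseline)."""
    by_fault = {}
    for fault, part in records:
        by_fault.setdefault(fault, Counter())[part] += 1
    return by_fault

def predict(model, fault, top_n=2):
    """Return the top_n most frequently used parts for a fault code."""
    return [part for part, _ in model.get(fault, Counter()).most_common(top_n)]

# Hypothetical repair history: (fault code, spare part used)
history = [("E07", "brake pad"), ("E07", "brake pad"),
           ("E07", "caliper"), ("E12", "door motor")]
model = train_baseline(history)
print(predict(model, "E07"))  # ['brake pad', 'caliper']
```

A gradient boosting classifier replaces this per-fault frequency table with a model that can also exploit context features such as vehicle type, mileage, or season.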
This paper describes the use of the single-linkage hierarchical clustering method in outlier detection for manufactured metal work pieces. The main goal of the study is to group defects that occur 5 mm into a work piece from the edge, i.e., the border of the metal work piece. The goal is to remove defects outside the area of interest as outliers. According to the assumptions made for the performance criteria, the single-linkage method has achieved better results compared to other agglomeration methods.
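The single-linkage idea described above can be shown in a compact, pure-Python sketch. The defect coordinates and the distance threshold below are hypothetical, and the study's actual implementation and performance criteria are not reproduced here:

```python
def single_linkage(points, threshold):
    """Agglomerative single-linkage clustering: repeatedly merge the two
    clusters whose closest members are nearest, stopping once the
    minimum inter-cluster distance exceeds the threshold."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    clusters = [[p] for p in points]
    while len(clusters) > 1:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(p, q) for p in clusters[i] for q in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        if best[0] > threshold:
            break
        _, i, j = best
        clusters[i] += clusters.pop(j)
    return clusters

# Hypothetical defect coordinates (mm): a dense group near the border
# of the work piece and one isolated defect far away from it
defects = [(1, 2), (2, 2), (2, 3), (40, 40)]
clusters = single_linkage(defects, threshold=5.0)
print(len(clusters))  # 2: the border group and the isolated outlier
```

Defects that end up in small, distant clusters can then be flagged as outliers and removed from the area of interest.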
In railway technical centers, scheduling maintenance activities is a very complex task: all maintenance operations must be ordered in time across the workstations while respecting the number of resources, precedence constraints and workstation availability. Currently, this process is not fully automatic. To improve this situation, this paper presents a mathematical model for scheduling maintenance activities in railway remanufacturing systems. The studied problem is modeled as a flexible job shop, with the possibility for a job to be executed several times at a stage. A MILP formulation is implemented with the makespan as objective, representing the time for remanufacturing the train. The aim is to create a generic model for optimizing the planning of maintenance activities and improving the performance of railway technical centers. Finally, numerical results are presented, discussing the impact of instance size on the computing time needed to solve the described problem.
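The makespan objective used in the MILP can be illustrated on a toy two-stage flow shop, solved here by brute force rather than by the paper's MILP formulation. The job names and durations are hypothetical, and a real flexible job shop is far more general than this sketch:

```python
from itertools import permutations

def flow_shop_makespan(order, jobs):
    """Makespan of a 2-stage flow shop: each job passes stage 1 then
    stage 2, and each stage handles one job at a time."""
    t1 = t2 = 0
    for j in order:
        a, b = jobs[j]
        t1 += a                # stage 1 finishes job j
        t2 = max(t2, t1) + b   # stage 2 starts once job and stage are free
    return t2

# Hypothetical (stage-1, stage-2) maintenance durations per train component
jobs = {"bogie": (3, 6), "brakes": (5, 2), "doors": (1, 2)}
best = min(permutations(jobs), key=lambda o: flow_shop_makespan(o, jobs))
print(best, flow_shop_makespan(best, jobs))
```

Exhaustive enumeration only works for tiny instances; the growth of solver time with instance size is exactly what the paper's numerical results examine for the MILP.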
In contrast to conventional aortic valve replacement, Transcatheter Aortic Valve Implantation (TAVI) is a new, highly specialized alternative to surgical valve replacement for patients with symptomatic severe aortic stenosis and high operative risk. The procedure is performed in a minimally invasive way and was introduced at the University Heart Centre Freiburg – Bad Krozingen in 2008. Outcomes have improved steadily over the years. The aim of the investigation is the analysis of electrocardiographic conduction times and the ECG changes recorded hours and days after the procedure, depending on the artificial heart valve model, which may lead to pacemaker implantation, as well as the analysis of treatment effectiveness.
Transcatheter aortic valve implantation is a new, safe treatment strategy for patients with symptomatic severe aortic stenosis and high operative risk. The aim of the study was to compare the pre- and post-transcatheter aortic valve implantation procedures to determine the atrioventricular conduction time as a potential predictor of permanent pacemaker therapy requirement after transcatheter aortic valve implantation. The transcatheter aortic valve implantation patients were divided into groups without pacemaker and with dual or single chamber pacemaker, with different atrioventricular conduction time disturbances before and after transcatheter aortic valve implantation. In heart failure patients without permanent pacemaker therapy after transcatheter aortic valve implantation, atrioventricular conduction time was prolonged after the procedure. In patients with permanent dual chamber pacemaker therapy after transcatheter aortic valve implantation, atrioventricular conduction time was normalised with dual chamber atrioventricular pacing mode. Atrioventricular conduction time may be a useful parameter to evaluate the risk of post-procedural atrioventricular conduction block and permanent pacemaker therapy in transcatheter aortic valve implantation patients.
Today's network landscape contains many different network technologies, a wide range of end devices with a large scale of capabilities and power, and an immense quantity of information and data represented in different formats. Research on 3D imaging, virtual reality and holographic techniques will result in new user interfaces (UI) for mobile devices and will increase their diversity and variety. In this paper, a software architecture is proposed to establish device- and content-format-independent communication, including 3D imaging and virtual reality data as content. As experimental validation, the concept is implemented in a collaborative Language Learning Game (LLG), a learning tool for language acquisition.
Nowadays, many applications, companies and parts of society are assumed to be always available online. However, according to [Times, Oct 31, 2011], 73% of the world population do not use the internet and thus aren't “online” at all. The most common reasons for not being “online” are expensive personal computer equipment and high costs for data connections, especially in developing countries that comprise most of the world's population (e.g. parts of Africa, Asia, Central and South America). However, it seems that these countries are leap-frogging the “PC and landline” age and moving directly to the “mobile” age. Decreasing prices for smart phones with internet connectivity and PC-like operating systems make it more affordable for these parts of the world population to join the “always-online” community. Storing learning content in a way accessible to everyone, including mobile and smart phones, therefore seems beneficial. This way, learning content can be accessed by personal computers as well as by mobile and smart phones and is thus accessible to a wide range of devices and users. A new trend in Internet technologies is moving to “the cloud”. This paper discusses the changes, challenges and risks of storing learning content in the “cloud”. The experiences were gathered while evaluating the changes necessary to make our solutions and systems “cloud-ready”.
Smoothie: a solution for device and content independent applications including 3D imaging as content
(2014)
Today's network landscape contains many different network technologies, a wide range of end devices with a large scale of capabilities and power, and an immense quantity of information represented in different data formats. Research on 3D imaging, virtual reality and holographic techniques will result in new user interfaces (UI) for mobile devices and will increase their diversity and variety. A lot of effort is being made to establish open, scalable and seamless integration of various technologies and content presentation for different devices, including mobile ones, considering the individual situation of the end user. Research is ongoing in different parts of the world, but the task is not completed yet. The goal of this research work is to solve the above-stated problems by investigating system architectures that provide unconstrained, continuous and personalized access to content and interactive applications everywhere and at any time with different devices. As a solution to the problem considered, a new architecture named “Smoothie” is proposed.
The concept of m-learning, which differs from other forms of e-learning, covers a wide range of possibilities opened up by the convergence of new mobile technologies, wireless communication structures and distance learning development. This convergence has raised new goals for supporting m-learning, where the heterogeneity of devices, their operating systems (Linux, Windows, Symbian, Android etc.) and supported markup languages (WML, XHTML etc.), adaptive content, and user preferences or characteristics have become some of the major problems to be solved. To facilitate the learning process even more and to establish literally anytime-anywhere learning, learning material should be available to the user even when offline. Multiple devices used by the same user should also be synchronized among themselves and with the server to provide updated learning content and to give the user the freedom to choose any device at his or her convenience. In this paper, a software architecture is proposed to solve these problems; it has been implemented in a multidimensional flashcard learning system that synchronizes all the devices used by a user.
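The multi-device synchronization described above can be sketched with a last-write-wins merge of flashcard decks. The abstract does not specify the actual conflict-resolution strategy, so the data model (card id mapped to a revision timestamp and content) and the example cards are hypothetical:

```python
def merge(server, device):
    """Last-write-wins merge of flashcard decks: each entry is
    card_id -> (revision_timestamp, content); the newer copy survives."""
    merged = dict(server)
    for card_id, (ts, content) in device.items():
        if card_id not in merged or ts > merged[card_id][0]:
            merged[card_id] = (ts, content)
    return merged

# Hypothetical decks: the server copy and an offline-edited device copy
server = {"c1": (10, "Haus = house"), "c2": (12, "Tür = door")}
device = {"c1": (15, "Haus = house (noun)"), "c3": (11, "Baum = tree")}
synced = merge(server, device)
print(sorted(synced))   # ['c1', 'c2', 'c3']
print(synced["c1"][1])  # the newer device edit wins
```

Running the same merge on every device and on the server after reconnecting keeps all copies converging to the same deck.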
Today's network landscape consists of quite different network technologies, a wide range of end devices with a large scale of capabilities and power, and an immense quantity of information and data represented in different formats. Research on 3D imaging, virtual reality and holographic techniques will result in new user interfaces (UI) for mobile devices and will increase their diversity and variety. A lot of effort is being made to establish open, scalable and seamless integration of various technologies and content presentation for different devices, including mobile ones, considering the individual situation of the end user. This is very difficult because the various kinds of devices used by different users, or at different times or in parallel by the same user, are not predictable and have to be recognized by the system in order to identify their capabilities. Not only the devices but also content and user interfaces are big issues, because they can include different kinds of data formats like text, image, audio, video, 3D virtual reality data and other upcoming formats. A very suitable and useful example of the use of such a system is mobile learning, because of the large number of varying devices with significantly different features and functionalities. This applies not only to supporting different learners, e.g. all learners within one learning community, but also to supporting the same learner using different equipment in parallel and/or at different times. Such applications may be significantly enhanced by including virtual reality content presentation. Whatever the purposes are, it is impossible to develop and adapt content for every kind of device, including mobiles, individually, due to the different capabilities of the devices, cost issues and authors' requirements. A solution should be found to enable the automation of the content adaptation process.
This paper describes a thorough analysis of using PPO to learn kick behaviors with simulated NAO robots in the SimSpark environment. The analysis includes an investigation of the influence of PPO hyperparameters, network size, training setups and performance in real games. We believe we improve the state of the art mainly in four points: first, the kicks are learned with a toed version of the NAO robot; second, we improve reliability with respect to the kickable area and the avoidance of falls; third, the kick can be parameterized with desired distance and direction as input to the deep network; and fourth, the approach allows the learned behavior to be integrated seamlessly into soccer games. The result is a significant improvement of the general level of play.
The Thread protocol is a recent development based on 6LoWPAN (IPv6 over IEEE 802.15.4), but with extensions regarding a more media-independent approach, which additionally promises true interoperability. To evaluate and analyse the operation of a Thread network, a given open source 6LoWPAN stack for embedded devices (emb::6) has been extended to comply with the Thread specification. The implementation covers Mesh Link Establishment (MLE) and network layer functionality as well as the 6LoWPAN mesh-under routing mechanism based on MAC short addresses. The development has been verified on a virtualization platform and allows the dynamic establishment of network topologies based on Thread's partitioning algorithm.
OPC UA (Open Platform Communications Unified Architecture) is already a well-known concept used widely in the automation industry. In the area of factory automation, OPC UA models the underlying field devices, such as sensors and actuators, in an OPC UA server to allow connected OPC UA clients to access device-specific information via a standardized information model. One of the requirements for the OPC UA server to represent field device data using its information model is prior knowledge about the properties of the field devices in the form of device descriptions. The international standard IEC 61804 specifies EDDL (Electronic Device Description Language) as a generic language for describing the properties of field devices. In this paper, the authors describe a possibility to dynamically map and integrate field device descriptions based on EDDL into OPC UA.
The IEEE 802.11p standard describes a protocol for car-to-X and, mainly, car-to-car communication. It has found its place in hardware and firmware implementations and is currently being tested in various field tests. In the research project Ko-TAG, which is part of the research initiative Ko-FAS, cooperative sensor technology is developed and its benefit for traffic safety applications is evaluated. A secondary radar principle based on communication signals enables the localization of objects with simultaneous data transmission. It mainly concentrates on the detection of pedestrians and other vulnerable road users (VRU), but also supports pre-crash safety applications. The Ko-TAG proposal adds to IEEE 802.11p the real-time characteristics needed for precise time-of-flight localization. This contribution describes the development of a subsystem that extends the functionality of IEEE 802.11p and fits into the regulatory schemes. It discusses the approach for the definition and verification of the protocol design while maintaining close coexistence with existing IEEE 802.11p subsystems. System simulations were performed and hardware was implemented. The next step will be field measurements to verify the simulation results.
Legacy industrial communication protocols have proved robust and functional. Over the last decades, the industry has invented completely new solutions or advanced versions of the legacy communication solutions. However, even with the high adoption rate of these new solutions, the majority of industrial applications still run on legacy, mostly fieldbus-related technologies. Profibus is one of those technologies that keeps growing in the market, albeit with slowing market growth in recent years. A retrofit technology that enables these legacy installations to connect to the Internet of Things and to utilize the ever-growing potential of data analysis, predictive maintenance or cloud-based applications, while at the same time not changing a running system, is therefore fundamental.
The research project Ko-TAG [2], as part of the research initiative Ko-FAS [1], funded by the German Ministry of Economics and Technologies (BMWi), deals with the development of a wireless cooperative sensor system that shall provide a benefit to current driver assistance systems (DAS) and traffic safety applications (TSA). The system’s primary function is the localization of vulnerable road users (VRU), e.g. pedestrians and powered two-wheelers, using communication signals, but it can also serve as a pre-crash (surround) safety system among vehicles. The main difference of this project, compared to previous ones that dealt with this topic, e.g. the AMULETT project, is an underlying FPGA-based hardware-software co-design. The platform drives a real-time capable communication protocol that enables highly scalable network topologies fulfilling the hard real-time requirements of the single localization processes. Additionally, it allows the exchange of further data (e.g. sensor data) to support the accident prediction process and the channel arbitration, and thus supports true cooperative sensing. This paper gives an overview of the project’s current system design as well as of the implementations of the key HDL entities supporting the software parts of the communication protocol. Furthermore, an approach for the dynamic reconfiguration of the devices is described, which provides several topology setups using a single PCB design.
Energy and environment continue to be major issues for humankind. This holds true on the regional, the national, and the global level. And it is one of the problem areas where engineers and scientists, in conjunction with political will and people's awareness, can find new approaches and solutions to save natural resources and to make their use more efficient.
The Institute of Reliable Embedded Systems and Communication Electronics at Offenburg University of Applied Sciences, Germany, has developed an automated testing environment, the Automated Physical TestBed (APTB), for analyzing the performance of wireless systems and their supporting protocols. Wireless networking nodes can connect to this APTB, and their antenna outputs are attached to RF waveguides. To model the RF environment, these waveguides establish wired connections among RF elements such as splitters, attenuators and switches. In such a setup, it is possible to vary the path characteristics by altering the attenuators and switches. The major advantage of using the APTB is an isolated, well-controlled, repeatable test environment for various conditions, allowing statistical analyses and even regression tests. This paper provides an overview of the design and implementation of the APTB and demonstrates its ability to automate test cases, as well as its efficiency.
Automatic Meter Reading (AMR) is a major enabler for the upcoming smart grid. Potentially, it will be one of the first really large-scale M2M-communication solutions for sensor applications.
To date, the definition of the standardized communication stacks for the Local Metrological Network (LMN) in AMR is still ongoing. This holds true both for the ZigBee Smart Energy Profile and for Wireless M-Bus according to EN 13757. During this process, there is the necessity for flexible, albeit optimized solutions, which support the different existing and upcoming versions of the communication protocols. In the case of Wireless M-Bus, the major contender for European and possibly Asian installations, this is valid not only for the different operation modes (C-, N-, P-, Q-, R-, S-, and T-modes), which operate at different frequencies (i.e. 868 MHz, 433 MHz, and 169 MHz), but also for the application layer, where additional standards and bodies, like EN 13757, the Open Metering System (OMS) Group, or national bodies, follow their own approaches.
This contribution describes requirements, design techniques and experiences from the development of highly efficient Wireless M-Bus protocol stacks that retain good flexibility and portability between microcontroller platforms and RF transceivers. The presented approach is not limited to modern software engineering design processes as such, but also includes essential additional features like testing and simulation, as well as tools for commissioning and monitoring.
The Internet of Things (IoT), ubiquitous computing and ubiquitous connectivity, Cyber Physical Systems (CPS), ambient intelligence, Machine-to-Machine communication (M2M) or Car-to-Car (C2C)-communication, smart metering, smart grid, telematics, telecare, telehealth – there are many buzzwords around current developments related to the Internet.
This contribution gives an overview of such IoT applications, as they are already used today to improve the availability of information, increase efficiency, push system limits and extend the value chain. On closer inspection, the economic and technical development can be separated into different phases. It is interesting that we are currently at the threshold of a new phase, with decentralized and cooperative communication and control nodes as cornerstones. Thus, embedded systems and their connectivity are at the center of the scene.
This recent development is described along with some example projects from the author’s team which are used in industrial automation, energy supply and distribution (home automation and smart metering), traffic engineering (cooperative driver assistance systems), and in telehealth and telecare.
The services sector, also called the “tertiary sector”, has become increasingly important in the last few decades. This ongoing structural change is characterized by a significant increase in employment in the services sector. On the other hand, the former economic importance of traditional areas, such as agriculture and forestry, as well as manufacturing, is declining.
To gain an overview, it is important to examine and analyse the different research studies.
The increased complexity and dynamics of the business environment and the problems of a young organization are treated extensively in the literature [Bleicher 2002, p. 34; Malik 1996, p. 86; Ulrich/Probst 1990, p. 23ff; Gomez 1999, p. 65]. This complexity is the core of the leadership role in a company [Malik 1996, p. 184]. STÜTTGEN (1999, p. 8) states in this regard: "A satisfactory answer to the question of the patterns by which complex social systems should be designed, so that the company can meet proliferating environmental complexity with an adequate intrinsic complexity, is in this context a critical success factor for management." How can young SMEs solve strategic problems with service engineering in their companies?
A 2002 study on corporate planning among the top German companies by turnover found that 80 percent of the surveyed large companies carried out strategic planning and 90 percent had operational planning in place [Link/Orbán 2002, p. 11]. The human and material costs of designing and implementing strategic planning can be very high, and many SMEs do not have the necessary capacities to do this. To obtain a comprehensive overview, this chapter examines the existing studies and findings on young SMEs. Many of the studies reviewed and the following publications relate to SMEs as defined by the EU. This analysis also includes established SMEs and medium-sized enterprises.
To demonstrate how deep learning can be applied to industrial applications with limited training data, we use deep learning methodologies in three different applications. In this paper, we perform unsupervised deep learning utilizing variational autoencoders and demonstrate that federated learning is a communication-efficient concept for machine learning that protects data privacy. As an example, variational autoencoders are utilized to cluster and visualize data from a microelectromechanical systems foundry. Federated learning is used in a predictive maintenance scenario using the C-MAPSS dataset.
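The privacy-preserving idea behind federated learning can be illustrated with a minimal sketch (not the paper's implementation): each client runs gradient descent on a toy linear model using only its local data, and the server merely averages the resulting weights, so raw data never leaves the clients.

```python
# Illustrative federated-averaging sketch on a toy linear regression task.
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=5):
    """One client's local gradient-descent epochs; raw data (X, y) stays local."""
    w = w.copy()
    for _ in range(epochs):
        w -= lr * 2 * X.T @ (X @ w - y) / len(y)
    return w

def federated_average(weights, sizes):
    """Server aggregates client models, weighted by local dataset size."""
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(weights, sizes))

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5])
clients = [(X, X @ true_w) for X in (rng.normal(size=(20, 3)) for _ in range(4))]

w_global = np.zeros(3)
for _ in range(50):  # communication rounds: only weights travel, never data
    updates = [local_update(w_global, X, y) for X, y in clients]
    w_global = federated_average(updates, [len(y) for _, y in clients])
```

Communication efficiency follows from exchanging only the model parameters once per round instead of streaming the local datasets.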
In this contribution, we propose a system setup for the detection and classification of objects in autonomous driving applications. The recognition algorithm is based upon deep neural networks operating in the 2D image domain. The results are combined with data from a stereo camera system to finally incorporate the 3D object information into our mapping framework. The detection system runs locally on the onboard CPU of the vehicle. Several network architectures are implemented and evaluated with respect to accuracy and run-time demands for the given camera and hardware setup.
Transthoracic impedance cardiography (ICG) is a non-invasive method for the determination of hemodynamic parameters. The basic principle of transthoracic ICG is the measurement of the electrical conductivity of the thorax over time. The aim of the study was the analysis of hemodynamic parameters from healthy individuals and the evaluation of various hemodynamic monitoring devices. Fourteen men (mean age 25 ± 4.59 years) and twelve women (mean age 24 ± 3.5 years) were measured in the cardiovascular engineering laboratory at Offenburg University of Applied Sciences, Offenburg, Germany. The ICG recordings were made with the devices CardioScreen 1000, CardioScreen 2000 and TensoScreen with the corresponding software Cardiovascular Lab 2.5 (Medis Medizinische Messtechnik GmbH, Ilmenau, Germany). In order to create identical conditions, all measurements were recorded in the same position and for the same duration. Various positions were simulated, from a horizontal lying position to a vertical standing position. Altogether, more than 30 hemodynamic parameters were measured.
This paper presents a method for supporting the application of Additive Tooling (AT)-based validation environments in integrated product development. Based on a case study, relevant process steps, activities and possible barriers in the realisation of an injection-moulded product are identified and analysed. The aim of the method is to support the target-oriented application of Additive Tooling to obtain physical prototypes at an early stage and to shorten validation cycles.
For the RoboCup Soccer AdultSize League, the humanoid robot Sweaty uses a single fully convolutional neural network to detect and localize the ball, opponents and other features on the field of play. This neural network can be trained from scratch in a few hours and is able to perform in real time within the constraints of the computational resources available on the robot. The time it takes to process an image is approximately 11 ms. Balls and goal posts are recalled in 99 % of all cases (94.5 % for all objects), accompanied by a false detection rate of 1.2 % (5.2 % for all). The object detection and localization helped Sweaty to become a finalist at RoboCup 2017 in Nagoya.
Spinal cord stimulation (SCS) is the most commonly used technique of neurostimulation. It involves the stimulation of the spinal cord and is therefore used to treat chronic pain. The existing esophageal catheters are used for temperature monitoring during an electrophysiology study with ablation and transesophageal echocardiography. The aim of the study was to model the spine and new esophageal electrodes for the transesophageal electrical pacing of the spinal cord, and to integrate them in the Offenburg heart rhythm model for the static and dynamic simulation of transesophageal neurostimulation. The modeling and simulation were both performed with the electromagnetic and thermal simulation software CST (Computer Simulation Technology, Darmstadt). Two new esophageal catheters were modelled as well as a thoracic spine based on the dimensions of a human skeleton. The simulation of directed transesophageal neurostimulation is performed using the esophageal balloon catheter with an electric pacing potential of 5 V and a trapezoidal signal. A potential of 4.33 V can be measured directly at the electrode, 3.71 V in the myocardium at a depth of 2 mm, 2.68 V in the thoracic vertebra at a depth of 10 mm, 2.1 V in the thoracic vertebra at a depth of 50 mm and 2.09 V in the spinal cord at a depth of 70 mm. The relation between the voltage delivered to the electrodes and the voltage applied to the spinal cord is linear. Virtual heart rhythm and catheter models as well as the simulation of electrical pacing fields and electrical sensing fields allow the static and dynamic simulation of directed transesophageal electrical pacing of the spinal cord. The 3D simulation of the electrical sensing and pacing fields may be used to optimize transesophageal neurostimulation.
An Empirical Study of Explainable AI Techniques on Deep Learning Models For Time Series Tasks
(2021)
Decision explanations of machine learning black-box models are often generated by applying Explainable AI (XAI) techniques. However, many proposed XAI methods produce unverified outputs. Evaluation and verification are usually achieved through visual interpretation by humans on individual images or texts. In this preregistration, we propose an empirical study and benchmark framework for applying attribution methods, developed for neural networks on image and text data, to time series. We present a methodology to automatically evaluate and rank attribution techniques on time series using perturbation methods to identify reliable approaches.
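The perturbation idea behind such an automatic evaluation can be sketched in a few lines (model and attributions here are toy stand-ins, not the studied XAI methods): occlude the time steps an attribution ranks as most relevant and measure how much the model output changes; a faithful attribution should cause a larger change than an uninformative one.

```python
# Toy sketch of perturbation-based evaluation for time-series attributions.
import numpy as np

def model(x):
    """Toy 'model': the output depends only on time steps 10..19."""
    return x[10:20].sum()

def perturbation_score(x, attribution, k=10):
    """Occlude the k time steps ranked most relevant; return the output change."""
    idx = np.argsort(attribution)[-k:]
    x_pert = x.copy()
    x_pert[idx] = 0.0
    return abs(model(x) - model(x_pert))

rng = np.random.default_rng(1)
x = rng.random(100) + 0.5  # strictly positive toy series

faithful = np.zeros(100)
faithful[10:20] = 1.0          # marks exactly the relevant steps
random_attr = rng.random(100)  # uninformative baseline attribution

# Ranking attributions by their perturbation score is the comparison signal.
```

Averaging such scores over many series and perturbation strategies yields the automatic ranking of attribution techniques.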
The robust scheduling problem is a major decision problem addressed in the literature, especially for remanufacturing systems; it is complex because of the high uncertainty and complex constraints involved. Generally, the existing approaches are dedicated to specific processes and do not enable the quick and efficient generation and evaluation of schedules. With the emergence of the Industry 4.0 paradigm, data availability is now considered an opportunity to facilitate the decision-making process. In this study, a data-driven decision-making process is proposed to treat the robust scheduling problem of remanufacturing systems in uncertain environments. In particular, this process generates simulation models based on a data-driven modeling approach. A robustness evaluation approach is proposed to answer several decision questions. An application of the decision process in an industrial case of a remanufacturing system is presented herein, illustrating the impact of robustness evaluation results on real-life decisions.
The humanoid Sweaty was a finalist in this year’s RoboCup soccer championship (AdultSize). For the optimization of the gait and the stability, data concerning forces and torques in the ankle joints are helpful. In the following paper, the development of a six-axis force and torque sensor for the humanoid robot Sweaty is described. Since commercial sensors do not meet the demands for the sensors in Sweaty’s ankle joints, a new sensor was developed. As measuring devices, we used strain gauges and custom electronics based on an acam PS09. The geometry was analyzed with the FEM program ANSYS to obtain optimal dimensions for the measuring beams. In addition, ANSYS was used to optimize the position of the strain gauges on the beam.
One of the challenges in humanoid robotics is motion control. Interacting with humans requires impedance control algorithms, as well as tackling the problem of the closed kinematic chains which occur when both feet touch the ground. However, pure impedance control for fully autonomous robots is difficult to realize, as this algorithm needs very precise sensors for the force and speed of the actuated parts, as well as very high sampling rates for the controller input signals. Both requirements lead to a complex and heavyweight design, resulting in heavy machines that are unusable in RoboCup Soccer competitions.
A lightweight motor controller was developed that can be used for admittance and impedance control as well as for model predictive control algorithms to further improve the gait of the robot.
Autonomous humanoid robots require lightweight, high-torque and high-speed actuators to be able to walk and run. For conventional gears with a fixed gear ratio, the product of torque and velocity is constant. Desired motions, on the other hand, require both maximum torque and maximum speed. In this paper it is shown that with a variable gear ratio it is possible to vary the relation between torque and velocity. This is achieved by introducing systems of rods and levers to move the joints of our humanoid robot "Sweaty II". With a variable gear ratio, low speed and high torque can be achieved for those joint angles which require this motion mode, whereas high speed and low torque can be realized for those joint angles where it is favorable for the desired motion.
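The trade-off described above can be stated compactly (notation assumed for illustration: $\tau_{\mathrm{m}}, \omega_{\mathrm{m}}$ are motor torque and speed, $q$ the joint angle):

```latex
% Fixed gear ratio i: torque and speed are coupled through the motor power P.
\tau_{\mathrm{out}} = i\,\tau_{\mathrm{m}}, \qquad
\omega_{\mathrm{out}} = \frac{\omega_{\mathrm{m}}}{i}, \qquad
\tau_{\mathrm{out}}\,\omega_{\mathrm{out}} = \tau_{\mathrm{m}}\,\omega_{\mathrm{m}} = P .

% Rod-and-lever linkage: the effective ratio depends on the joint angle q,
% so the torque/speed split can be matched to the motion phase.
i(q) = \frac{\partial \theta_{\mathrm{m}}}{\partial q}, \qquad
\tau_{\mathrm{joint}}(q) = i(q)\,\tau_{\mathrm{m}}, \qquad
\dot{q} = \frac{\omega_{\mathrm{m}}}{i(q)} .
```

With a fixed $i$, raising torque necessarily lowers speed; a joint-angle-dependent $i(q)$ lets the same motor deliver high torque where the gait demands it and high speed elsewhere.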
Heart rhythm model and simulation of electrophysiological studies and high-frequency ablations
(2017)
Background: The target of the study was to create an accurate anatomic CAD heart rhythm model and to show its usefulness for cardiac electrophysiological studies and high-frequency ablations. The method is gentler on the patients’ health and has the potential to replace clinical studies due to its high efficiency regarding time and costs.
Methods: All natural heart components of the new HRM were based on MRI records, which guaranteed electronic functionality. The software CST was used for the construction, while CST’s material library assured genuine tissue properties. It should be applicable to simulate different heart rhythm diseases as well as various diffusions of electromagnetic fields, caused by electrophysiological conduction, inside the heart tissue.
Results: It was achievable to simulate sinus rhythm and fourteen different heart rhythm disturbances with different atrial and ventricular conduction delays. The simulated biological excitation of the healthy and diseased HRM was plotted by simulated electrodes of a four-polar right atrial catheter, a six-polar His bundle catheter, a ten-polar coronary sinus catheter, a four-polar ablation catheter and an eight-polar transesophageal left cardiac catheter. Accordingly, six variables were rebuilt and inserted into the anatomic HRM in order to establish heart catheters for ECG monitoring and HF ablation. The HF ablation catheters made it possible to simulate various types of heart rhythm disturbance ablations with different HF ablation catheters and also showed a functional visualisation of tissue heating. The use of a tetrahedral meshing HRM made it attainable to store the results faster with a higher degree of space saving. The smart meshing function reduced unnecessarily high resolutions for coarse structures.
Conclusions: The new HRM for EPS simulation may be additionally useful for the simulation of heart rhythm disturbances, cardiac pacing and HF ablation, and for locating and identifying complex fractionated signals within the atrium during atrial fibrillation HF ablation.
OVVL (the Open Weakness and Vulnerability Modeller) is a tool and methodology to support threat modeling in the early stages of the secure software development lifecycle. We provide an overview of OVVL (https://ovvl.org), its data model and its browser-based UI. We also discuss initial experiments on how threats identified in the design phase can be aligned with later activities in the software lifecycle (issue management and security testing).
Microscale trigeneration systems are highly flexible in their operation and thus offer the technical possibility for peak-load shifting in building demand-side management. However, to harness their potential, modern control methods such as model predictive control must be implemented for their optimal scheduling. In the literature, the need for experimental investigation of microscale trigeneration systems to identify typical characteristics of the components and their interactions has been identified. On a real-life setup, control-specific information about the components is collected, and lessons learnt during commissioning of the equipment are shared. The data is analysed to derive the vital characteristics of the system and will be used to create models of the components that can be utilised for optimal control.
The transformation of the building energy sector into a highly efficient, clean, decentralised and intelligent system requires innovative technologies like microscale trigeneration and thermally activated building structures (TABS) to pave the way ahead. The combination of such technologies, however, presents both a scientific and an engineering challenge: a scientific challenge in terms of developing optimal thermo-electric load management strategies based on overall energy system analysis, and an engineering challenge in terms of implementing these strategies through process planning and control. Initial literature research has pointed out the need for a multiperspective analysis in a real-life laboratory environment. To this effect, an investigation is proposed wherein an analytical model of a microscale trigeneration system integrated with TABS will be developed and compared with a real-life test rig corresponding to building management systems. Data from the experimental analysis will be used to develop control algorithms using model predictive control for achieving the thermal comfort of occupants in the most energy-efficient and grid-reactive manner. The scope of this work encompasses adsorption-cooling-based microscale trigeneration systems and their deployment in residential and light commercial buildings.
A coordinated operation of decentralised micro-scale hybrid energy systems within a locally managed network, such as a district or neighbourhood, will play a significant role in the sector-coupled energy grid of the future. A quantitative analysis of the effects of primary energy factors, energy conversion efficiencies, load profiles, and control strategies on their energy-economic balance can aid in identifying important trends concerning their deployment within such a network. In this contribution, operational data from five energy laboratories in the trinational Upper Rhine region are evaluated and compared to a conventional reference system. Ten exemplary data sets representing typical operating conditions for the laboratories in different seasons, together with the latest information on the national energy strategies, are used to evaluate the primary energy consumption, CO2 emissions, and demand-related costs. Various conclusions on the ecological and economic feasibility of hybrid building energy systems are drawn to provide a toe-hold for the engineering community in their planning and development.
We herein present a topology design method based on local optimality criteria which has been implemented in an open source Navier-Stokes solver for turbulent flows. Our method aims for the fast generation of geometry proposals in the early conceptual phase. To the best of our knowledge, this is the first local criteria approach utilizing a wall function turbulence model in order to consider turbulent flows. In order to allow for the growth as well as the shrinkage, or even the formation or disappearance of structural features, a topological approach is chosen. By introducing a volume fraction parameter, we distinguish between fluid and solid properties in each control volume. The fluid-solid interface is represented by an immersed boundary method using a piecewise linear surface reconstruction.
To achieve Germany's climate targets, the industrial sector, among others, must be transformed. The decarbonization of industry through the electrification of heating processes is a promising option. In order to investigate this transformation in energy system models, high-resolution temporal demand profiles of the heat and electricity applications for different industries are required. This paper presents a method for generating synthetic electricity and heat load profiles for 14 industry types. Using this methodology, annual profiles with a 15-minute resolution can be generated for both energy demands. First, daily profiles for the electricity demand are generated for 4 different production days. These daily profiles are additionally subdivided into eight end-use application categories. Finally, white noise is applied to the profile of the mechanical drives. The heat profile is similar to the electrical one but is subdivided into four temperature ranges and the two applications of hot water and space heating. The space heating application is additionally adjusted to the average monthly outdoor temperature. Both time series were generated for the analysis of an electrification of industrial heat applications in energy system modelling.
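The construction of such a profile can be sketched as follows (all shares, shift times, and the noise level are illustrative assumptions, not the paper's data): a 15-minute daily base profile is split into end-use shares, and white noise is applied only to the mechanical-drives share.

```python
# Illustrative sketch: 15-min synthetic daily electricity profile from
# end-use shares, with white noise on the mechanical-drives share only.
import numpy as np

STEPS = 96  # 24 h at 15-minute resolution
rng = np.random.default_rng(42)

t = np.arange(STEPS) / 4.0  # time of day in hours
base = np.where((t >= 6) & (t < 22), 1.0, 0.4)  # assumed two-shift production day

# Assumed end-use shares (the paper uses eight categories; four shown here).
shares = {"mechanical_drives": 0.45, "process_heat": 0.25,
          "lighting": 0.10, "other": 0.20}

profile = np.zeros(STEPS)
for use, share in shares.items():
    p = base * share
    if use == "mechanical_drives":
        p = p * (1 + 0.05 * rng.standard_normal(STEPS))  # white noise
    profile += p
```

A full annual series would repeat this construction over the four production-day types and, for heat, scale the space-heating share by the monthly mean outdoor temperature.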
In recent years, the topic of embedded machine learning has become very popular in AI research. With the help of various compression techniques such as pruning and quantization, it has become possible to run neural networks on embedded devices. These techniques have opened up a whole new application area for machine learning, ranging from smart products such as voice assistants to smart sensors that are needed in robotics. Despite the achievements in embedded machine learning, efficient algorithms for training neural networks in constrained domains are still lacking. Training on embedded devices would open up further fields of application: efficient training algorithms would enable federated learning on embedded devices, in which the data remains where it was collected, or the retraining of neural networks in different domains. In this paper, we summarize techniques that make training on embedded devices possible. We first describe the need and requirements for such algorithms. Then we examine existing techniques that address training in resource-constrained environments as well as techniques that are also suitable for training on embedded devices, such as incremental learning. Finally, we discuss which problems and open questions still need to be solved in these areas.
Training deep neural networks using backpropagation is very memory- and compute-intensive. This makes it difficult to run on-device learning or to fine-tune neural networks on tiny embedded devices such as low-power microcontroller units (MCUs). Sparse backpropagation algorithms try to reduce the computational load of on-device learning by training only a subset of the weights and biases. Existing approaches use a static number of weights to train. A poor choice of this so-called backpropagation ratio limits either the computational gain or can lead to severe accuracy losses. In this paper we present TinyProp, the first sparse backpropagation method that dynamically adapts the backpropagation ratio during on-device training for each training step. TinyProp induces a small calculation overhead to sort the elements of the gradient, which does not significantly impact the computational gains. TinyProp works particularly well for fine-tuning trained networks on MCUs, which is a typical use case for embedded applications. For three typical datasets, MNIST, DCASE2020 and CIFAR10, we are 5 times faster compared to non-sparse training with an accuracy loss of on average 1%. On average, TinyProp is 2.9 times faster than existing static sparse backpropagation algorithms, and the accuracy loss is reduced on average by 6% compared to a typical static setting of the backpropagation ratio.
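The core mechanism of sparse backpropagation with a dynamic ratio can be illustrated on a toy problem (the adaptation rule below, which ties the ratio to the current relative loss, is a simplified assumption and not TinyProp's exact formula): per step, the gradient entries are sorted by magnitude and only the top fraction is applied.

```python
# Conceptual sketch: sparse updates with a dynamically adapted
# backpropagation ratio on a toy least-squares problem.
import numpy as np

def sparse_step(w, grad, ratio, lr=0.1):
    """Update only the top `ratio` fraction of gradient entries by magnitude."""
    k = max(1, int(ratio * grad.size))
    keep = np.argsort(np.abs(grad))[-k:]   # sort induces the small overhead
    sparse_grad = np.zeros_like(grad)
    sparse_grad[keep] = grad[keep]
    return w - lr * sparse_grad

rng = np.random.default_rng(7)
X = rng.normal(size=(64, 10))
true_w = rng.normal(size=10)
y = X @ true_w

w = np.zeros(10)
loss0 = np.mean((X @ w - y) ** 2)
for _ in range(200):
    err = X @ w - y
    grad = 2 * X.T @ err / len(y)
    # Assumed schedule: larger current error -> denser update, floor at 10 %.
    ratio = min(1.0, max(0.1, np.mean(err ** 2) / loss0))
    w = sparse_step(w, grad, ratio)
```

The intuition matches the fine-tuning use case: early steps (large error) update densely, while later steps touch only a few weights, saving computation.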
Currently, QRS width and bundle branch block morphology are used as electrocardiographic guideline criteria to select heart failure (HF) patients with interventricular desynchronization in sinus rhythm (SR) for cardiac resynchronisation therapy (CRT). Nevertheless, up to 30% of these patients do not benefit from the implantation of CRT systems. The esophageal left ventricular electrogram (LVE) enables semi-invasive measurement of interventricular conduction delays (IVCD) even in patients with atrial fibrillation (AF). To routinely apply this method, a programmer-based semi-invasive automatic quantification of IVCD should be developed. Our aim was to define interventricular conduction delays by analyzing fractionated left ventricular (LV) deflections in the esophageal left ventricular electrogram of HF patients in SR or AF.
In 66 HF patients (49 male, 17 female, age 65 ± 10 years) a 5F TOslim electrode (Osypka AG, Germany) was perorally applied. Using the BARD EP Lab, cardiac desynchronization was quantified as the interval IVCD between the onset of QRS in the surface ECG and the investigator-determined onset of the left ventricular deflection in the LVE. IVCD was compared with the intervals between QRS onset and the first maximum (IVCDm1) and between QRS onset and the second maximum (IVCDm2) of the LV complex.
A QRS of 173 ± 26 ms was linked with an empirical IVCD of 75 ± 25 ms, at mean. The first and second LV maximum could be ascertained beyond doubt in all patients. Significant correlations at the p < 0.01 level were found between IVCD and the IVCDm1 of 96 ± 28 ms as well as between IVCD and the IVCDm2 of 147 ± 31 ms, at mean. To standardize the automatic measurement of interventricular conduction delays with respect to patients with fractionated LV complexes, the first maximum of the LV deflection should be utilized to quantify the IVCD of HF patients with sinus rhythm and atrial fibrillation.
The efficient support of Hardware-in-the-Loop (HIL) in the design process of hardware-software co-designed systems is an ongoing challenge. This paper presents a network-based integration of hardware elements into the software-based image processing tool "ADTF", based on a high-performance Gigabit Ethernet MAC and a highly efficient TCP/IP stack. The MAC has been designed in VHDL. It was verified in a SystemC simulation environment and tested on several Altera FPGAs.
Effect of downhill running on biomechanical risk factors associated with iliotibial band syndrome
(2022)
The purpose of this study was to identify the influence of downhill running on biomechanical risk factors for iliotibial band syndrome. We conducted a 3D motion analysis of 22 female and male runners on an instrumented treadmill at four different inclinations (0%, -5%, -10%, -15%) at a speed of 3.5 m/s. We found significant differences in biomechanical risk factors associated with iliotibial band syndrome. Peak knee flexion angle at initial ground contact (p < .001), peak knee adduction angle (p = .005), and iliotibial band strain (p < .001) systematically increased with increasing slope. Downhill running thus increases biomechanical risk factors for iliotibial band syndrome. Our results highlight the need to consider the individual running environment when assessing overuse injury risk in runners.
Components of rocket engines such as actively cooled combustion chambers must withstand high pressure as well as severe and complex thermal transients. While the thermal transients result in temperature gradients and, thus, in constrained thermal strains, the pressure load induces mean stresses. To assess the mechanical behaviour of such components during design via finite-element calculations, constitutive models are necessary that appropriately describe the time- and temperature-dependent plasticity of the material.
Advanced models account for viscoplastic deformations including isotropic and kinematic hardening, recovery and ratcheting. However, these models contain a relatively large number of temperature-dependent material properties that must be determined on the basis of material test data. The determination of the properties is a non-trivial task because it is not clear which loading history must be applied in the tests for a certain material to obtain stable and robust (i.e. objective) material properties. Consequently, the determined properties depend on the underlying loading history in the tests as well as on the experience and judgement of the person determining them. This results in uncertainties during the assessment of the components that must be countered with conservative designs, leading to negative consequences in terms of mass and costs.
It is the aim of this work, funded by the European Space Agency (ESA), to derive a procedure for determining stable and robust material properties of an advanced viscoplastic constitutive model for aerospace materials. To this end, a special loading history is applied in isothermal material tests conducted with copper at different temperatures in the range from 300 to 700 K. To determine the material properties and to assess their stability and robustness, numerical optimization as well as analytical and statistical methods are used. The determined material properties are validated on the basis of results of thermomechanical material tests, also conducted in the temperature range from 300 to 700 K.
The paper describes the methodology and experimental results for revealing similarities in the thermal dependencies of accelerometer and gyroscope biases from 250 inertial MEMS chips (MPU-9250). Temperature profiles were measured on an experimental setup with a Peltier element for temperature control. The classification of temperature curves was carried out with a machine learning approach.
A perfect sensor should have no thermal dependency at all. Thus, only sensors inside the clusters with smaller dependency (smaller total temperature slopes) might be pre-selected for the production of high-accuracy inertial navigation modules. It was found that no unified thermal profile ("family" curve) exists for all sensors in a production batch. However, sensors can be grouped according to their parameters, and temperature compensation profiles can then be regressed for each group. Twelve slope coefficients over 5-degree temperature intervals from 0°C to +60°C were used as the features for the k-means++ clustering algorithm.
In our case, the minimum number of clusters for all sensors to be well separated from each other by their bias thermal profiles is six; it was found by applying the elbow method. For each cluster, a regression curve can be obtained.
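As a rough illustration of the clustering step, here is a minimal pure-Python k-means++ sketch; the feature vectors in the test are short 2-D stand-ins for the actual 12-dimensional slope-coefficient vectors, and the code is not the authors' pipeline.

```python
import random

def dist2(p, q):
    """Squared Euclidean distance between two feature vectors."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeanspp_init(points, k, rng):
    """k-means++ seeding: later centers are drawn with probability
    proportional to their squared distance from existing centers."""
    centers = [rng.choice(points)]
    while len(centers) < k:
        d2 = [min(dist2(p, c) for c in centers) for p in points]
        r = rng.random() * sum(d2)
        acc = 0.0
        for p, d in zip(points, d2):
            acc += d
            if acc >= r:
                centers.append(p)
                break
    return centers

def kmeans(points, k, iters=25, seed=0):
    """Lloyd's algorithm with k-means++ initialization. Returns cluster
    labels and the total within-cluster inertia (the quantity plotted
    over k for the elbow method)."""
    rng = random.Random(seed)
    centers = kmeanspp_init(points, k, rng)
    labels = [0] * len(points)
    for _ in range(iters):
        # assign each point to its nearest center
        labels = [min(range(k), key=lambda j: dist2(p, centers[j]))
                  for p in points]
        # recompute each center as the mean of its members
        for j in range(k):
            members = [p for p, l in zip(points, labels) if l == j]
            if members:
                centers[j] = tuple(sum(x) / len(members)
                                   for x in zip(*members))
    inertia = sum(dist2(p, centers[l]) for p, l in zip(points, labels))
    return labels, inertia
```

Running `kmeans` for k = 1…10 and plotting the returned inertia against k yields the elbow curve from which a cluster count such as the six reported above would be read off.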
Patients with focal ventricular tachycardia are at risk of hemodynamic failure, and if no treatment is provided, the mortality rate can exceed 30%. Therefore, medical professionals must be adequately trained in the management of these conditions. To achieve the best treatment, the origin of the abnormality should be known, as well as the course of the disease. This study provides an opportunity to visualize various focal ventricular tachycardias using the Offenburg cardiac rhythm model.
Artificial Intelligence (AI) can potentially transform many aspects of modern society in various ways, including automation of tasks, personalization of products and services, diagnosis of diseases and their treatment, transportation, safety, and security in public spaces. Recently, AI technology has been transforming the financial industry, offering new ways to analyse data and automate processes, reduce costs, increase efficiency, and provide more personalized services to customers. However, it has also raised important ethical and regulatory questions that need to be addressed by the industry and society as a whole. The aim of the Erasmus+ project Transversal Skills in Applied Artificial Intelligence - TSAAI (KA220-HED - Cooperation Partnerships in higher education) has been to establish a training platform that incorporates teaching guidelines based on a curriculum covering different areas of application of AI technology. In this work, we focus on applying AI models in the financial and insurance sectors.
The German Weather Service (DWD) releases a heat warning when the weather forecast indicates warm, humid, sunny, and windless conditions during the next days. The heat stress is calculated by the so-called Klima-Michel model: if the apparent air temperature exceeds approximately 32°C / 38°C, there is strong / extreme heat stress, respectively. The smallest forecast area is the administrative district. As people, and especially the vulnerable population, stay indoors most of the time, the heat health warning system (HHWS) was extended by the prediction of heat stress in typical rooms. This makes it feasible to forecast heat stress using a combination of outdoor and indoor heat stress. The prediction of indoor heat stress is based on the same weather forecast as the HHWS and calculates the heat stress with the PMV model (predicted mean vote). Based on a sophisticated data analysis and simulation study, realistic but summer-critical living situations were defined and implemented in the building simulation program ESP-r. As the simulation runs especially for extreme weather conditions, a simplified building model cannot be used. Standardized input/output routines and an adaptive handover of start values provide short run times for each forecast area. Good building design and urban planning provide effective measures to reduce heat stress in cities. However, we also have to pay attention to the existing building stock under climate change and a higher heat-wave risk. The extended German HHWS provides information for the emergency services to support social assistants during heat waves.
This study presents some results from a monitoring project with night ventilation and an earth-to-air heat exchanger. Both techniques are forms of air-based low-energy cooling. As these technologies are limited to specific boundary conditions (e.g. a moderate summer climate, low temperatures during the night, or low ground temperatures, respectively), water-based low-energy cooling may be preferred in many projects. A comparison of the night-ventilated building with a ground-cooled building shows major differences between both concepts.
Background: Disturbed synchronization of the ventricular contraction can cause advanced systolic heart failure in affected patients, which can often be attributed to left bundle branch block (LBBB). If patients remain unresponsive to medication, they are treated with a cardiac resynchronization therapy (CRT) system. The aim of this study was to integrate His bundle pacing into the Offenburg heart rhythm model in order to visualize the electrical pacing field generated by His bundle pacing.
Methods: Modelling and electrical field simulation were performed with the software CST (Computer Simulation Technology) from Dassault Systèmes. CRT with biventricular pacing is achieved by an apical right ventricular electrode and an additional left ventricular electrode, which is floated into the coronary sinus. This conventional type of biventricular pacing leads to a reduction of the left ventricular ejection fraction. Furthermore, the non-responder rate of CRT therapy is about one third of the CRT patients.
Results: His bundle pacing represents a physiological alternative to conventional cardiac pacing and cardiac resynchronization. An electrode implanted in the His bundle emits a stronger electrical pacing field than that of conventional cardiac pacemakers. The pacing of the His bundle was performed with the Medtronic SelectSecure 3830 electrode at pacing voltage amplitudes of 3 V, 2 V and 1.5 V in combination with a pacing pulse duration of 1 ms.
Conclusions: Compared to conventional cardiac pacemaker pacing, His bundle pacing is capable of bridging LBBB conduction disorders in the left ventricle. The His bundle pacing electrical field is able to spread via the physiological pathway in the right and left ventricles for CRT with a narrow QRS-complex in the surface ECG.
Artificial intelligence (AI), and in particular machine learning algorithms, are of increasing importance in many application areas, but interpretability and understandability as well as responsibility, accountability, and fairness of the algorithms' results, all crucial for increasing human trust in the systems, are still largely missing. Big industrial players, including Google, Microsoft, and Apple, have become aware of this gap and recently published their own guidelines for the use of AI in order to promote fairness, trust, interpretability, and other goals. Interactive visualization is one of the technologies that may help to increase trust in AI systems. During the seminar, we discussed the requirements for trustworthy AI systems as well as the technological possibilities provided by interactive visualizations to increase human trust in AI.