Organized by the Fraunhofer Additive Manufacturing Alliance, the bi-annual Direct Digital Manufacturing Conference brings together researchers, educators and practitioners from around the world. The conference covers the entire range of topics in additive manufacturing, starting with methodologies, design and simulation, right up to more application-specific topics, e.g. from the realm of medical engineering and electronics.
Enhancing engineering creativity with automated formulation of elementary solution principles
(2023)
The paper describes a method for the automated formulation of elementary creative stimuli for product or process design at different levels of abstraction and in different engineering domains. The experimental study evaluates the impact of structured automated idea generation on inventive thinking in engineering design and compares it with previous experimental studies in educational and industrial settings. The outlook highlights the benefits of using automated ideation in the context of AI-assisted invention and innovation.
Established robot manufacturers have developed methods to determine and optimize the accuracy of their robots. These methods vary from one manufacturer to another, and due to the lack of published data, a comparison of robot performance is difficult. The aim of this article is to find methods to evaluate important characteristics of a robot with an accurate and cost-effective setup. A laser triangulation sensor and geometrically referenced spheres were used as the basis for comparing robot performance.
The paper compares different anti-windup strategies for the current control of inverter-fed permanent magnet synchronous machines (PMSM) controlled by pulse-width modulation. In this respect, the focus is on the drive behavior with a relatively large product of stator frequency and sampling time. A requirement for dynamically high-quality anti-windup measures is, among other things, a sufficiently accurate decoupling of the stator current direct axis and quadrature axis components even at high stator frequencies. Discrete-time models of the electrical subsystem of the PMSM are well suited for this purpose, of which the method found to be the most accurate in a preliminary investigation is used as the basis for all anti-windup methods examined. Simulation studies and measurement results document the performance of the compared methods.
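The anti-windup idea compared in the abstract above can be sketched in a few lines. The following is a minimal illustration of back-calculation anti-windup on a discrete-time PI current controller, with a first-order RL approximation standing in for one PMSM current axis; all parameter values are hypothetical and this is not the authors' implementation.

```python
# Illustrative sketch (hypothetical parameters, not the authors' implementation):
# a discrete-time PI current controller with back-calculation anti-windup,
# driving a first-order RL approximation of one PMSM current axis.

def simulate(kp=4.0, ki=50.0, kaw=10.0, ts=1e-4, r=0.5, l=1e-3,
             u_max=10.0, i_ref=5.0, steps=4000):
    i, integ = 0.0, 0.0
    for _ in range(steps):
        e = i_ref - i
        u_unsat = kp * e + integ
        u = max(-u_max, min(u_max, u_unsat))        # inverter voltage limit
        # back-calculation: bleed the integrator while the output saturates
        integ += ts * (ki * e + kaw * (u - u_unsat))
        i += ts * (u - r * i) / l                   # forward-Euler RL plant
    return i

print(round(simulate(), 3))   # settles near the 5 A reference
```

Without the `kaw` term, the integrator would keep accumulating while the output is clamped, causing overshoot once the limit is released; the back-calculation term drains it in proportion to the saturation error.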
In 2015, Google engineer Alexander Mordvintsev presented DeepDream as a technique to visualise the feature analysis capabilities of deep neural networks that have been trained on image classification tasks. For a brief moment, this technique enjoyed some popularity among scientists, artists, and the general public because of its capability to create seemingly hallucinatory synthetic images. But soon after, research moved on to generative models capable of producing more diverse and more realistic synthetic images. At the same time, the means of interaction with these models have shifted away from a direct manipulation of algorithmic properties towards a predominance of high-level controls that obscure the model's internal workings. In this paper, we present research that returns to DeepDream to assess its suitability as a method for sound synthesis. We consider this research to be necessary for two reasons: it tackles a perceived lack of research on musical applications of DeepDream, and it addresses DeepDream's potential to combine data-driven and algorithmic approaches. Our research includes a study of how the model architecture, choice of audio datasets, and method of audio processing influence the acoustic characteristics of the synthesised sounds. We also look into the potential application of DeepDream in a live-performance setting. For this reason, the study limits itself to models consisting of small neural networks that process time-domain representations of audio. These models are resource-friendly enough to operate in real time. We hope that the results obtained so far highlight the attractiveness of DeepDream for musical approaches that combine algorithmic investigation with curiosity-driven and open-ended exploration.
Digital, virtual environments and the metaverse are rapidly taking shape and will generate disruptive changes in the areas of ethics, privacy, safety, and how the relationships between human beings will be developed. To uncover some of the implications that will impact those areas, this study investigates the perceptions of 101 younger people from Generations Y and Z. We present a first exploratory analysis of the findings, focusing on knowledge and self-perception. Results show that these young generations seriously doubt their knowledge of the metaverse and virtual worlds, regarding both the definition and the usage. It is interesting to see only a medium confidence level, considering that the participants are young and from an academic environment, which should increase their interest in and affinity towards virtual worlds. Males from both generations perceive themselves as significantly more knowledgeable than females. Regarding a fitting definition, almost 40% agreed on the metaverse as a “universal and immersive virtual world that is made accessible using virtual reality and augmented reality technologies”. Regarding the topic in general, several participants considered themselves sceptics (almost 40%) or “just” users (38%). Interestingly, Generation Y participants were more likely than the younger Generation Z participants to identify themselves as early adopters or innovators. As a result, the considerable amount of “mixed feelings” regarding digital, virtual environments and the metaverse shows that in-depth studies on the perception of the metaverse as well as its ethical and integrity implications are required to create more accessible, inclusive, and safe digital, virtual environments.
Variable refrigerant flow (VRF) and variable air volume (VAV) systems are considered among the best heating, ventilation, and air conditioning (HVAC) systems thanks to their ability to provide cooling and heating in different thermal zones of the same building, as well as their ability to recover the heat rejected from spaces requiring cooling and reuse it to heat other spaces. Nevertheless, these systems are at the same time among the most energy-consuming systems in a building, so it is crucial to size the system appropriately according to the building’s cooling and heating needs and the indoor temperature fluctuations. This study aims to compare these two energy systems by conducting an energy model simulation of a real building under a semi-arid climate for cooling and heating periods. The developed building energy model (BEM) was validated and calibrated using measured and simulated indoor air temperature and energy consumption data. The study aims to evaluate the effect of these HVAC systems on energy consumption and the indoor thermal comfort of the building. The numerical model was based on the EnergyPlus simulation engine. The approach used in this paper allowed us to reach significant quantitative energy savings along with a high level of indoor thermal comfort by using the VRF system compared to the VAV system. The findings show that the VRF system provides 46.18% annual total heating energy savings and 6.14% annual cooling and ventilation energy savings compared to the VAV system.
The variable refrigerant flow (VRF) system is one of the best heating, ventilation, and air conditioning (HVAC) systems thanks to its ability to provide thermal comfort inside buildings. At the same time, however, these systems are among the most energy-consuming systems in the building sector. Thus, it is crucial to size the system appropriately according to the building’s cooling and heating needs and the indoor temperature fluctuations. Although many researchers have studied the optimization of building energy performance considering heating or cooling needs, using air handling units, radiant floor heating, and direct expansion valves, few studies have considered multi-objective optimization using only the thermostat setpoints of VRF systems for both cooling and heating needs. Thus, the main aim of this study is to conduct a sensitivity analysis and a multi-objective optimization strategy for a residential building containing a variable refrigerant flow system, to evaluate the effect of the building performance on energy consumption and improve the building energy efficiency. The numerical model was based on the EnergyPlus, jEPlus, and jEPlus+EA simulation engines. The approach used in this paper allowed us to reach significant quantitative energy savings by varying the cooling and heating setpoints and scheduling scenarios. It should be stressed that this approach could be applied to several HVAC systems to reduce building energy consumption.
The increasing diffusion of rapidly developing AI technologies led to the idea of the experiment to combine TRIZ-based automated idea generation with the natural language processing tool ChatGPT, using the chatbot to interpret the automatically generated elementary solution principles. The article explores the opportunities and benefits of a novel AI-enhanced approach to teaching systematic innovation, analyses the learning experience, identifies the factors that affect students' innovation and problem-solving performance, and highlights the main difficulties students face, especially in interdisciplinary problems.
Inner Congo
(2023)
This research-creation project, part of the DE\GLOBALIZE artistic research cycle presented at the #IFM2022 Conference, investigates the complexities of Congo violence, care, and colonialism. Drawing on Michel Serres' metaphor of the great estuaries, the study explores the topology of interactive documentaries, blending theory, emotion, and personal experiences. Accessible through the interactive web documentation at http://deglobalize.com, the platform offers a media-archaeological archive for speculative ethnography, enabling the forensic processing of single documents in line with actor-network theory.
Generative machine learning models for creative purposes play an increasingly prominent role in the field of dance and technology. A particularly popular approach is the use of such models for generating synthetic motions. Such motions can either serve as a source of ideation for choreographers or control an artificial dancer that acts as an improvisation partner for human dancers. Several examples employ autoencoder-based deep-learning architectures that have been trained on motion capture recordings of human dancers. Synthetic motions are then generated by navigating the autoencoder's latent space. This paper proposes an alternative approach to using an autoencoder for creating synthetic motions. This approach controls the generation of synthetic motions on the level of the motion itself rather than its encoding. Two different methods are presented that follow this principle. Both methods are based on the interactive control of a single joint of an artificial dancer while the other joints remain under the control of the autoencoder. The first method combines the control of the orientation of a joint with iterative autoencoding. The second method combines the control of the target position of a joint with forward kinematics and the application of latent difference vectors. As an illustrative example of an artistic application, this latter method is used for an artificial dancer that plays a digital instrument. The paper presents the implementation of these two methods and provides some preliminary results.
Teaching and learning concepts that are adapted to the constantly evolving requirements due to rapid technological progress are essential for teaching in media photonics technology. After the development of a concept for research-oriented education in optics and photonics, the next step will be a conceptual restructuring and redesign of the entire curriculum for education in media photonics technology. By including typical research activities as essential components of the learning process, a broad platform for practical projects and applied research can be created, offering a variety of new development opportunities.
Lithium-ion batteries show strongly nonlinear behaviour regarding the battery current and state of charge. Therefore, the modelling of lithium-ion batteries is complex. Combining physical and data-driven models in a grey-box model can simplify the modelling. Our focus is on using neural networks, especially neural ordinary differential equations, for grey-box modelling of lithium-ion batteries. A simple equivalent circuit model serves as a basis for the grey-box model. Unknown parameters and dependencies are then replaced by learnable parameters and neural networks. We use experimental full-cycle data and data from pulse tests of a lithium iron phosphate cell to train the model. Finally, we test the model against two dynamic load profiles: one consisting of half cycles and one dynamic load profile representing a home-storage system. The dynamic response of the battery is well captured by the model.
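The grey-box idea described in the abstract above can be sketched compactly. The following is a minimal illustration of a first-order RC equivalent-circuit cell model in plain Python; in the paper's approach, terms such as the open-circuit-voltage dependency would be replaced by small neural networks with learnable parameters, whereas here it is a hypothetical fixed placeholder, and all parameter values are assumptions.

```python
# Illustrative grey-box sketch (not the authors' model): a first-order RC
# equivalent circuit for a lithium-ion cell. In the paper's approach, terms
# such as ocv(soc) would be replaced by learnable neural networks; here it
# is a hypothetical fixed placeholder.

def ocv(soc):
    # placeholder open-circuit-voltage curve (would be learned from data)
    return 3.2 + 0.8 * soc

def simulate(current, dt=1.0, capacity_as=3600.0, r0=0.02, r1=0.01, c1=2000.0):
    """Simulate terminal voltage for a current profile (A, discharge > 0)."""
    soc, v_rc = 1.0, 0.0
    voltages = []
    for i in current:
        soc -= i * dt / capacity_as                 # coulomb counting
        v_rc += dt * (i / c1 - v_rc / (r1 * c1))    # RC polarisation state
        voltages.append(ocv(soc) - r0 * i - v_rc)   # terminal voltage
    return soc, voltages

soc, v = simulate([1.0] * 600)   # 10 min discharge at 1 A
```

In a grey-box setting, the physical structure (coulomb counting, RC dynamics) is kept, and only the poorly known dependencies are made learnable.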
The robust scheduling problem is a major decision problem addressed in the literature, especially for remanufacturing systems; this problem is complex because of the high uncertainty and complex constraints involved. Generally, the existing approaches are dedicated to specific processes and do not enable the quick and efficient generation and evaluation of schedules. With the emergence of the Industry 4.0 paradigm, data availability is now considered an opportunity to facilitate the decision-making process. In this study, a data-driven decision-making process is proposed to treat the robust scheduling problem of remanufacturing systems in uncertain environments. In particular, this process generates simulation models based on a data-driven modeling approach. A robustness evaluation approach is proposed to answer several decision questions. An application of the decision process in an industrial case of a remanufacturing system is presented herein, illustrating the impact of robustness evaluation results on real-life decisions.
A ban on imports of Russian energy sources to Germany is currently being increasingly discussed. We want to support this discussion by showing how the electricity system in Germany can manage low energy imports in the short term and which measures are necessary to still meet the climate protection targets. In this paper, we examine the impact of a complete stop of Russian fossil fuel imports on the electricity sector in Germany, and how this will affect the climate goals of an earlier coal phase-out and climate neutrality by 2045.
Following a scenario-based analysis, the results provide a perspective on what would be needed to cope without scarce non-renewable energy resources in Germany. Large investments would be needed to ensure a secure supply of electricity, both in renewable energy sources (RES) and in energy storage systems (ESS). The key finding is that a rapid expansion of renewables and storage technologies will significantly reduce the dependence of the German electricity system on energy imports. The large-scale integration of renewable energy does not entail any significant imports of the energy sources natural gas, hard coal, and mineral oil, even in the long term. The results showed that a ban on fossil fuel imports from Russia presents a substantial opportunity to go beyond the German government's climate targets, with the 1.5-degree target being achieved in the electricity system.
Peer-to-peer energy trading and local electricity markets have been widely discussed as new options for the transformation of the energy system from the traditional centralized scheme to a novel decentralized one. Moreover, peer-to-peer trading has also been proposed as a more favourable alternative to expiring feed-in tariff policies that promote investment in renewable energy sources. Peer-to-peer energy trading is usually defined as the integration of several innovative technologies that enable both prosumers and consumers to trade electricity, without intermediaries, at a consented price. Furthermore, the techno-economic aspects go hand in hand with the socio-economic aspects, which ultimately represent significant barriers that need to be tackled to reach a higher impact on current power systems. Applying a qualitative analysis, two scalable peer-to-peer concepts are presented in this study, along with the entry probability of possible participants into such concepts. Results show that consumers with a preference for environmental aspects generally have a higher willingness to participate in peer-to-peer energy trading. Moreover, battery storage systems are a key technology that could elevate the entry probability of prosumers into a peer-to-peer market.
In railway technical centers, scheduling the maintenance activities is a very complex task: it consists of ordering all maintenance operations on the workstations over time, while respecting the number of resources, precedence constraints, and the workstations' availabilities. Currently, this process is not completely automatic. To improve this situation, this paper presents a mathematical model for scheduling maintenance activities in the case of railway remanufacturing systems. The studied problem is modeled as a flexible job-shop, with the possibility for a job to be executed several times on a stage. A MILP formulation is implemented with the makespan as the objective, representing the time for remanufacturing the train. The aim is to create a generic model for optimizing the planning of maintenance activities and improving the performance of railway technical centers. Finally, numerical results are presented, discussing the impact of instance size on the computing time needed to solve the described problem.
To achieve Germany's climate targets, the industrial sector, among others, must be transformed. The decarbonization of industry through the electrification of heating processes is a promising option. In order to investigate this transformation in energy system models, high-resolution temporal demand profiles of the heat and electricity applications for different industries are required. This paper presents a method for generating synthetic electricity and heat load profiles for 14 industry types. Using this methodology, annual profiles with a 15-minute resolution can be generated for both energy demands. First, daily profiles for the electricity demand were generated for four different production days. These daily profiles are additionally subdivided into eight end-use application categories. Finally, white noise is applied to the profile of the mechanical drives. The heat profile is similar to the electrical one but is subdivided into four temperature ranges and the two applications hot water and space heating. The space heating application is additionally adjusted to the average monthly outdoor temperature. Both time series were generated for the analysis of an electrification of industrial heat applications in energy system modelling.
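The white-noise step described in the abstract above can be illustrated with a short sketch. The following builds a hypothetical 15-minute daily profile and superimposes white noise only on the mechanical-drives share; the base profile, drives share, and noise level are all assumptions, not the paper's data.

```python
# Illustrative sketch (hypothetical numbers, not the paper's method): a
# 15-minute daily electricity profile where white noise is applied only to
# the mechanical-drives share of the load.
import random

random.seed(0)
QUARTER_HOURS = 96                       # 24 h at 15-minute resolution

# hypothetical normalised base profile: higher load during a day shift
base = [0.4 if q < 24 or q >= 88 else 1.0 for q in range(QUARTER_HOURS)]
drives_share = 0.3                       # assumed share from mechanical drives

profile = []
for load in base:
    drives = load * drives_share
    noise = random.gauss(0.0, 0.05 * drives)    # white noise on drives only
    profile.append(load - drives + max(drives + noise, 0.0))
```

The other end-use categories stay deterministic; only the stochastic drives component varies between generated days.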
The energy system has been changing for some years in order to achieve the climate goals of the Paris Agreement, which aims to prevent an increase of the global temperature above 2 °C [1]. Decarbonisation of the energy system has become a big challenge for governments, and different strategies are being established. Germany has set greenhouse gas reduction limits for different years and keeps track of the progress made yearly. The expansion of renewable energy systems (RES) together with decarbonisation technologies is a key factor in accomplishing this objective.
This research analyses the effect of introducing biochar, a decarbonisation technology, and studies how it affects the energy system. Pyrolysis is the process from which biochar is obtained, and it is modelled in an open-source energy system model. A sensitivity analysis is carried out to assess the effect of changing the biomass potential and the costs of pyrolysis.
The role of pyrolysis is analysed in the form of different future scenarios for the year 2045 to evaluate the impact when the CO2 emission limit is zero. All scenarios are compared to the reference scenario, where pyrolysis is not considered.
Results show that biochar can be used to compensate for the emissions of other conventional power plants and achieve an energy transition at lower costs. Furthermore, it was also found that pyrolysis can reduce the need for flexibility. This study also shows that the biomass potential and the pyrolysis costs can strongly affect the behaviour of pyrolysis in the energy system.
The contribution of the RoofKIT student team to the SDE 21/22 competition is the extension of an existing café in Wuppertal, Germany, to create new functions and living space for the building with simultaneous energetic upgrading. A demonstration unit is built representing a small cut-out of this extension. The developed energy concept was thoroughly simulated by the student team in seminars using Modelica. The system uses mainly solar energy via PVT collectors as the heat source for a brine-water heat pump (space heating and hot water). Energy storage (thermal and electrical) is installed to decouple generation and consumption. Simulation results confirm that carbon neutrality is achieved for the building operation, consuming and generating around 60 kWh/m²a.
In the "BioMeth" project, two novel plant concepts, not previously described for biological methanation, were developed. The newly developed inverse membrane reactor (IMR) makes it possible to spatially separate the input of the required educt gases hydrogen (H2) and carbon dioxide (CO2), via commercially available ultrafiltration membranes, from the degassing zone for methane extraction, and additionally to use hydraulic pressure to increase the hydrogen input. One advantage of the process is that, in the long term, both the CO2 from conventional biogas and CO2 sources from industrial exhaust streams, e.g. from the cement industry, can be used as the carbon source.
Beyond biological methanation, in the authors' assessment the inverse membrane reactor is also generally suitable for the biotechnological production of non-volatile products from gaseous substrates. In the IMR, for example, one membrane module can be used to introduce the educt gases, while a further hollow-fibre membrane module can be used for the cyclic or continuous separation of the product-containing reaction solution while retaining the microbiology, in the sense of an in-situ product recovery (ISPR) concept.
An outstanding result of the investigation of the IMR was that, with the membrane gassing concept, CH4 concentrations of > 90 vol.% were achieved continuously and with flexible gas input over a one-year test series. After commissioning, apart from the addition of H2 and CO2 as energy and carbon sources, only two additions of supplements were required. The maximum membrane-area-specific methane formation rate achieved without gas circulation was 83 LN methane per m² of membrane area per day, at a product gas composition of 94 vol.% methane, 2 vol.% H2, and 4 vol.% CO2.
The second process, still in an early test phase, uses pressure differences in a 10 m high packed counter-current bubble column, combined with a likewise 10 m high separate degassing reactor. This process concept is intended to achieve high hydrogen solubility due to the hydrostatic pressure at the foot of the column, while at the same time minimising the energy demand, reducing the investment costs, and creating optimal temporal and spatial conditions for the microbiological conversion of H2 and CO2. Initial investigations on the counter-current bubble column reactor concerning the mass transfer of air confirmed good enrichment of the circulated liquid even at relatively low superficial gas velocities. In the second column of the reactor setup, the liquid, supersaturated with gas relative to atmospheric pressure, is expected to outgas at the top due to pressure relief. This outgassing of the liquid was likewise confirmed using air input as an example.
In recent years, the topic of embedded machine learning has become very popular in AI research. With the help of various compression techniques such as pruning, quantization, and others, it became possible to run neural networks on embedded devices. These techniques have opened up a whole new application area for machine learning, ranging from smart products such as voice assistants to smart sensors that are needed in robotics. Despite the achievements in embedded machine learning, efficient algorithms for training neural networks in constrained domains are still lacking. Training on embedded devices will open up further fields of application. Efficient training algorithms would enable federated learning on embedded devices, in which the data remains where it was collected, or retraining of neural networks in different domains. In this paper, we summarize techniques that make training on embedded devices possible. We first describe the need and requirements for such algorithms. Then we examine existing techniques that address training in resource-constrained environments as well as techniques that are also suitable for training on embedded devices, such as incremental learning. Finally, we discuss which problems and open questions still need to be solved in these areas.
During the coronavirus crisis, labs had to be offered in digital form in mechanical engineering at short notice. For this purpose, digital twins of more complex test benches in the field of fluid energy machines were used in the mechanical engineering course, with which the students were able to interact remotely to obtain measurement data. The concept of the respective lab was revised with regard to its implementation as a remote laboratory. Fortunately, real-world labs were able to be fully replaced by remote labs. Student perceptions of remote labs were mostly positive. This paper explains the concept and design of the digital twins and the lab as well as the layout, procedure, and finally the results of the accompanying evaluation. However, the implementation of the digital twins to date does not yet include features that address the tactile experience of working in real-world labs.
In this paper, we study the runtime performance of symmetric cryptographic algorithms on an embedded ARM Cortex-M4 platform. Symmetric cryptographic algorithms can serve to protect the integrity and optionally, if supported by the algorithm, the confidentiality of data. A broad range of well-established algorithms exists, where the different algorithms typically have different properties and come with different computational complexity. On deeply embedded systems, the overhead imposed by cryptographic operations may be significant. We execute the algorithms AES-GCM, ChaCha20-Poly1305, HMAC-SHA256, KMAC, and SipHash on an STM32 embedded microcontroller and benchmark the execution times of the algorithms as a function of the input lengths.
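The benchmarking approach described in the abstract above translates directly into a small script. The following sketch measures HMAC-SHA256 execution time as a function of input length using Python's standard library on a desktop machine; it illustrates the methodology only and says nothing about actual Cortex-M4 timings, and the key and message lengths are arbitrary choices.

```python
# Illustrative sketch (desktop Python, not the authors' STM32/Cortex-M4
# benchmark): timing HMAC-SHA256 over different input lengths, mirroring
# the idea of measuring execution time as a function of message size.
import hashlib
import hmac
import timeit

key = b"\x00" * 32   # arbitrary 256-bit key

def mac(msg: bytes) -> bytes:
    return hmac.new(key, msg, hashlib.sha256).digest()

results = {}
for length in (16, 64, 256, 1024):
    msg = b"\xaa" * length
    # average time per MAC over many runs, in microseconds
    t = timeit.timeit(lambda: mac(msg), number=2000) / 2000
    results[length] = t * 1e6

for length, micros in results.items():
    print(f"{length:5d} bytes: {micros:8.2f} us per HMAC-SHA256")
```

On an embedded target, the same structure applies, but the timer would typically be a hardware cycle counter rather than `timeit`.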
The sharp rise in electricity and oil prices due to the war in Ukraine has caused fluctuations in the results of a previous study on the economic analysis of electric buses. This paper shows how the increase in fuel prices affects the implementation of electric buses. This publication constructs a Total Cost of Ownership (TCO) model for the transition to electric buses in the small-to-mid-size city of Offenburg. The future development of costs is estimated, and a projection based on learning curves is carried out. This study intends to introduce a new future prospect by presenting the latest data based on previous research. Through the new TCO result, the cost differences between the existing diesel bus and the electric bus are updated, and the future prospects for the economic feasibility of the electric bus in a small and mid-size city are presented.
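The core of a TCO comparison like the one in the abstract above is simple arithmetic. The following sketch compares a diesel and an electric bus over the vehicle lifetime; all parameter values are hypothetical assumptions for illustration, not the paper's data.

```python
# Illustrative sketch with hypothetical numbers (not the paper's data):
# a minimal Total Cost of Ownership (TCO) comparison between a diesel
# and an electric bus over the vehicle lifetime.

def tco(purchase, energy_price, consumption_per_km, maintenance_per_km,
        km_per_year=60_000, years=12, residual=0.0):
    km = km_per_year * years
    return (purchase - residual
            + km * consumption_per_km * energy_price
            + km * maintenance_per_km)

# assumed parameters: diesel in EUR/l and l/km, electric in EUR/kWh and kWh/km
diesel = tco(purchase=250_000, energy_price=1.80, consumption_per_km=0.40,
             maintenance_per_km=0.30)
electric = tco(purchase=450_000, energy_price=0.30, consumption_per_km=1.30,
               maintenance_per_km=0.20)
print(f"diesel: {diesel:,.0f} EUR, electric: {electric:,.0f} EUR")
```

With these assumed numbers the higher purchase price of the electric bus is offset by lower energy and maintenance costs per kilometre; a full model would additionally discount future cash flows and apply learning curves to the purchase price.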
Electronic door signs for displaying information have by now become widespread, particularly in public buildings. These electronic door signs range from tablet-based signs to PC-based signs with an external screen. The systems are usually operated at 230 V. With a large number of door signs in public buildings, this can lead to significant energy consumption. This paper presents the development of an energy-self-sufficient door sign based on an e-paper display. The door sign can be configured via a smartphone app and an NFC interface. Particular attention is paid to the low-power hardware design of the electronics and to energy aspects.
This paper describes the authors' first experiments in creating an artificial dancer whose movements are generated through a combination of algorithmic and interactive techniques with machine learning. This approach is inspired by the time-honoured practice of puppeteering. In puppeteering, an articulated but inanimate object seemingly comes to life through the combined effects of a human controlling select limbs of a puppet while the rest of the puppet's body moves according to gravity and mechanics. In the approach described here, the puppet is a machine-learning-based artificial character that has been trained on motion capture recordings of a human dancer. A single limb of this character is controlled either manually or algorithmically while the machine-learning system takes over the role of physics in controlling the remainder of the character's body. But rather than imitating physics, the machine-learning system generates body movements that are reminiscent of the particular style and technique of the dancer who was originally recorded for acquiring training data. More specifically, the machine-learning system operates by searching for body movements that are not only similar to the training material but that it also considers compatible with the externally controlled limb. As a result, the character playing the role of a puppet is no longer passively responding to the puppeteer but makes movement decisions on its own. This form of puppeteering establishes a form of dialogue between puppeteer and puppet in which both improvise together, and in which the puppet exhibits some of the creative idiosyncrasies of the original human dancer.
Strings P
(2021)
Strings is an audiovisual performance for an acoustic violin and two generative instruments, one for creating synthetic sounds and one for creating synthetic imagery. The three instruments are related to each other conceptually, technically, and aesthetically by sharing the same physical principle, that of a vibrating string. This submission continues the work the authors have previously published at xCoAx 2020. The current submission briefly summarizes the previous publication and then describes the changes that have been made to Strings. The P in the title emphasizes that most of these changes have been informed by experiences collected during rehearsals (Proben in German). These changes have helped Strings to progress from a predominantly technical framework to a work that is ready for performance.
Active participation of industrial enterprises in electricity markets - a generic modeling approach
(2021)
Industrial enterprises represent a significant portion of electricity consumers, with the potential to provide demand-side energy flexibility from their production processes and on-site energy assets. Methods are needed for the active and profitable participation of such enterprises in electricity markets, especially those with variable prices, in which the energy flexibility available in their manufacturing, utility and energy systems can be assessed and quantified. This paper presents a generic model library equipped with optimal control for energy flexibility purposes. The components in the model library represent the different technical units of an industrial enterprise on the material, media, and energy flow levels, together with their process constraints. The paper also presents a case study simulation of a steel-powder manufacturing plant using the model library. Its energy flexibility was assessed when the plant procured its electrical energy at fixed and at variable electricity prices. In the simulated case study, flexibility use at dynamic prices resulted in a 6% cost reduction compared to a fixed-price scenario, with battery storage and the manufacturing system making the largest contributions to flexibility.
A coordinated operation of decentralised micro-scale hybrid energy systems within a locally managed network such as a district or neighbourhood will play a significant role in the sector-coupled energy grid of the future. A quantitative analysis of the effects of primary energy factors, energy conversion efficiencies, load profiles, and control strategies on their energy-economic balance can aid in identifying important trends concerning their deployment within such a network. In this contribution, operational data from five energy laboratories in the trinational Upper Rhine region are analysed and compared to a conventional reference system. Ten exemplary datasets representing typical operating conditions of the laboratories in different seasons, together with the latest information on the national energy strategies, are used to evaluate primary energy consumption, CO2 emissions, and demand-related costs. Various conclusions on the ecological and economic feasibility of hybrid building energy systems are drawn to provide the engineering community with a starting point for planning and development.
Activities for rehabilitation and prevention are often lengthy and associated with pain and frustration. Their playful enrichment (hereafter: gamification) can counteract this, resulting in so-called “exergames”. However, in contrast to games designed solely for entertainment, the increased motivation and immersion in gamified training can lead to a reduced perception of pain and thus to health deterioration. Therefore, it is necessary to monitor activities continuously. However, only an AI-based system able to generate autonomous interventions could free up the therapists’ costly time and allow better training at home. An automated adjustment of the movement training’s difficulty as well as individualized goal setting and control are essential to achieve such autonomy. This article’s contribution is two-fold: (1) We portray the potentials of gamification in the health area. (2) We present a framework for smart rehabilitation and prevention training allowing autonomous, dynamic, and gamified interactions.
Patients with focal ventricular tachycardia are at risk of hemodynamic failure and if no treatment is provided the mortality rate can exceed 30%. Therefore, medical professionals must be adequately trained in the management of these conditions. To achieve the best treatment, the origin of the abnormality should be known, as well as the course of the disease. This study provides an opportunity to visualize various focal ventricular tachycardias using the Offenburg cardiac rhythm model.
Disturbances of the cardiac conduction system causing reentry mechanisms above the atrioventricular (AV) node are induced by at least one accessory pathway with different conducting properties and refractory periods. This work aims to further develop the already existing and continuously expanding Offenburg heart rhythm model to visualise the most common supraventricular reentry tachycardias to provide a better understanding of the cause of the respective reentry mechanism.
Grey-box modelling combines physical and data-driven models to benefit from their respective advantages. Neural ordinary differential equations (NODEs) offer new possibilities for grey-box modelling, as differential equations given by physical laws and neural networks can be combined in a single modelling framework. This simplifies simulation and optimization and allows irregularly sampled data to be considered during training and evaluation of the model. We demonstrate this approach using two levels of model complexity: first, a simple parallel resistor-capacitor circuit; and second, an equivalent circuit model of a lithium-ion battery cell, in which the change of the voltage drop over the resistor-capacitor circuit, including its dependence on current and state of charge, is implemented as a NODE. After training, the two models show good agreement with analytical solutions and with experimental data, respectively.
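To make the physical backbone of such a grey-box model concrete, the following is a minimal sketch of the parallel resistor-capacitor circuit mentioned above, integrated with forward Euler. The parameter values are invented for illustration, and the learned (neural network) part of the NODE is deliberately omitted; in the actual approach it would replace or correct the right-hand side of the ODE.

```python
def simulate_parallel_rc(i_src, R, C, v0=0.0, dt=1e-3, steps=20000):
    """Forward-Euler integration of a parallel RC circuit driven by a
    constant current source: C * dv/dt = i_src - v/R.
    In a NODE-based grey-box model, this right-hand side is where a
    neural network term would be added."""
    v = v0
    for _ in range(steps):
        v += dt * (i_src - v / R) / C
    return v

# After many time constants (tau = R*C = 1 s here), the capacitor voltage
# approaches the analytical steady state v = i_src * R = 2.0 V.
v_final = simulate_parallel_rc(i_src=1.0, R=2.0, C=0.5)
```

In practice, an adaptive ODE solver would be used instead of fixed-step Euler, which is what makes irregularly sampled data straightforward to handle.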
Most recently, the federal government in Germany published new climate goals in order to reach climate neutrality by 2045. This paper demonstrates a path to a cost-optimal energy supply system for the German power grid until the year 2050. With special regard to regionality, the system is based on yearly myopic optimization of the required energy system transformation measures and the associated system costs. The results show that energy storage systems (ESS) are fundamental for integrating renewables into a feasible energy transition. Moreover, investment in storage technologies increased the usage of solar and wind technologies. Solar energy investments were largely accompanied by the installation of short-term battery storage, while longer-term storage technologies, such as H2, were accompanied by high installations of wind technologies. The results indicate that hydrogen investments are expected to overtake short-term batteries if their cost continues to decrease sharply. Moreover, with a strong presence of ESS in the energy system, biomass energy is expected to be ruled out of the energy mix entirely. With the current emission reduction strategy and without a strong presence of large-scale ESS in the system, it is unlikely that the Paris Agreement's 2 °C target will be achieved by 2050, let alone the 1.5 °C target.
This paper describes a taxonomy that makes it possible to assess and compare different implementations of master data objects. A systematic breakdown of core entities provides a framework for distinguishing four categories of master data objects: independent and dependent objects, relational objects, and reference objects that serve to attribute information. This supports the preparation of data migrations from one system to another.
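The four categories can be sketched in code. The following is a hypothetical illustration of how such a taxonomy might be used to order objects for a migration (the category names come from the abstract; the `migration_order` helper and its ranking are an invented example, not the paper's method):

```python
from enum import Enum, auto
from dataclasses import dataclass, field

class MasterDataCategory(Enum):
    INDEPENDENT = auto()   # exists on its own, e.g. a material
    DEPENDENT = auto()     # only meaningful together with a parent object
    RELATIONAL = auto()    # links two or more other objects
    REFERENCE = auto()     # serves to attribute information, e.g. a code list

@dataclass
class MasterDataObject:
    name: str
    category: MasterDataCategory
    depends_on: list = field(default_factory=list)

def migration_order(objects):
    """Hypothetical helper: reference and independent objects are migrated
    before the dependent and relational objects that point to them."""
    rank = {MasterDataCategory.REFERENCE: 0,
            MasterDataCategory.INDEPENDENT: 1,
            MasterDataCategory.DEPENDENT: 2,
            MasterDataCategory.RELATIONAL: 3}
    return sorted(objects, key=lambda o: rank[o.category])
```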
The present work addresses the problem of bicycle road assessment, which is currently done using expensive special measuring vehicles. Our alternative approach to road condition assessment is to mount a sensor device on a bicycle which sends accelerometer and gyroscope data via WiFi to a classification server. There, a prediction model determines road type and condition from the sensor data. For the classification task, we compare different machine learning methods with each other, whereby validation accuracies of 99% can be achieved with deep residual networks such as InceptionTime. The main contribution of this work with respect to comparable work is that we achieve excellent accuracies on a realistic dataset, classifying road conditions into nine distinct classes that are highly relevant for practice.
MPC-Workshop Februar 2020
(2021)
The targeted climate-protection goals require that renewable energies become the main source of the energy supply in the long term. To reach this ambitious goal, it makes sense to intelligently interconnect conventional and renewable, or better still sustainable, individual processes.
The EBIPREP project is carried out by an interdisciplinary research group consisting of chemists, process engineers and bioprocess engineers, as well as physicists specializing in sensors and process control. The aim is to develop new solutions for the use of wood chips and of the wood press juice obtained during mechanical drying. In addition to wood-chip gasification and catalytic cleaning of the wood gas, the focus is on using the wood press juice in biogas plants and for the biotechnological production of valuable substances, e.g. in enzyme production.
What we do
The EBIPREP project is carried out by an interdisciplinary research group composed of chemists, process engineers, bioprocess engineers and physicists. The aim is to develop new solutions for the use of wood chips and of the wood press juice obtained through an innovative mechanical drying process. In addition to wood gasification and catalytic cleaning of the wood gas, the use of wood press juice in biogas plants and in biotechnological production processes for valuable substances is envisaged. Wood chips are gasified thermally. Online sensors are being developed to evaluate the relevant parameters of the stabilized and optimized individual processes. Linking thermal and biotechnological conversion processes could help to considerably reduce the size of biogas reactors, which in turn would lead to a noticeable cost reduction.
Goals of the EBIPREP project
• combining the advantages of thermal and biological biomass conversion;
• developing a process for reducing pollutant emissions using innovative sensors and catalytic treatment of the synthesis gases;
• sustainable production of biotechnologically valuable products;
• economic and ecological analysis of the overall process compared with the individual processes;
• use of process wastewater to generate renewable energy or biotechnological valuable substances;
• acquisition of new knowledge in the field of residue recovery and energy generation;
• opening up new fields of application for innovative sensors and ceramic foams for catalysts;
• reducing the costs of biogas production
The planned overview presentation introduces the interlinked structures of the EBIPREP project and its central results.
Strings
(2020)
This article presents the currently ongoing development of an audiovisual performance work with the title Strings. This work provides an improvisation setting for a violinist, two laptop performers, and two generative systems. At the core of Strings lies an approach that establishes a strong correlation among all participants by means of a shared physical principle. The physical principle is that of a vibrating string. The article discusses how this principle is used in both natural and simulated forms as main interaction layer between all performers and as natural or generative principle for creating audio and video.
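The shared physical principle, a vibrating string, can be sketched numerically. The following finite-difference (leapfrog) scheme for the 1D wave equation with fixed ends is an illustrative sketch only, not the generative system actually used in Strings; all parameter values are invented:

```python
import math

def step_string(y_prev, y_curr, c=1.0, dt=0.01, dx=0.01, damping=0.0):
    """One leapfrog step of the 1D wave equation y_tt = c^2 * y_xx with
    fixed ends -- a minimal model of a vibrating string."""
    r2 = (c * dt / dx) ** 2
    y_next = [0.0] * len(y_curr)
    for i in range(1, len(y_curr) - 1):
        y_next[i] = (2 * y_curr[i] - y_prev[i]
                     + r2 * (y_curr[i + 1] - 2 * y_curr[i] + y_curr[i - 1]))
        y_next[i] *= (1.0 - damping)   # optional energy loss per step
    return y_curr, y_next

# Pluck the string into its fundamental mode (half sine, zero velocity).
n = 101
y0 = [math.sin(math.pi * i / (n - 1)) for i in range(n)]
y_prev, y_curr = y0[:], y0[:]
for _ in range(200):                   # with c*dt/dx = 1, one full period
    y_prev, y_curr = step_string(y_prev, y_curr)
```

With the Courant number c·dt/dx = 1, the scheme reproduces the travelling-wave solution, so after one period (200 steps here) the string returns to its initial shape.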
Hochspannungs-Mischstrom-Übertragung (HMÜ) - Eine Ergänzung zu bestehenden Übertragungstechnologien?
(2019)
In mixed-current transmission, a direct current is superimposed directly on an alternating current, so that AC and DC are carried on the same conductor. This would allow the existing three-phase transmission lines of the grid to be used. By applying the direct current to existing overhead lines, theoretically up to 50 % more active power can be expected on short lines (< 150 km), and roughly a doubling of the transferable active power on long lines (> 300 km). In theoretical terms, mixed-current transmission is a geometric addition of all current and voltage components, which increases the line-to-earth voltage without affecting the line-to-line voltage. Furthermore, the transmission of reactive currents becomes unnecessary, since operating the lines of the three-phase (HDÜ) grid at their natural load is recommended. The theoretical considerations were proven mathematically, and the technical implementation was demonstrated and confirmed with a 1:1000 model system.
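The geometric addition of components can be made concrete: for a sinusoidal voltage with a superimposed DC component, the total RMS value follows from adding the components geometrically. A small sketch (the 220 kV figures are illustrative, not taken from the paper):

```python
import math

def mixed_rms(u_ac_rms, u_dc):
    """Total RMS of an AC voltage with a superimposed DC component.
    The components add geometrically: U = sqrt(U_ac_rms^2 + U_dc^2),
    which is why the line-to-earth stress rises under mixed current."""
    return math.sqrt(u_ac_rms ** 2 + u_dc ** 2)

# Illustrative figures: 220 kV AC (RMS, line-to-earth) plus 220 kV DC
u_total = mixed_rms(220e3, 220e3)   # about 311 kV line-to-earth
```

Since the DC component is identical on all three conductors, it cancels in the difference between any two phases, which is why the line-to-line voltage is unaffected.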
A concept is presented for the biological methanation of hydrogen directly in biogas reactors, in which membrane-based gas injection can raise the methane content of the biogas to above 96 %. Maintaining an optimal pH range and avoiding H2 accumulation are essential for reaching such high methane levels. If the methane formation rate is limited by the anaerobic degradation of the biomass itself, an external supply of CO2 for further methane formation is also conceivable. The process is to be further optimized and tested in practice, in a project funded by the Deutsche Bundesstiftung Umwelt, at the biogas plant of a regional cheese dairy. The intended combination of decentralized waste recycling and on-site energy generation at a food-processing company, integrated into an intelligent renewable-energy concept, is expected to provide additional value.
Implementierung von Softcore-Prozessoren und/oder weiteren IPs (Intellectual Property) in FPGAs
(2018)
The increasing integration of complete systems on a chip (system-on-chip, SoC) always requires the integration of a processing unit or processor core. For low-power SoC systems in particular, e.g. wireless sensor SoCs for Industrie 4.0 applications, implementing such a processor core poses considerable challenges. In principle, different approaches can be pursued, namely the implementation of a hardcore processor IP (IP = intellectual property) or a softcore processor IP. This paper first reviews the current state of the art of available hardcore and softcore processors under the constraints of low-power requirements and wide industrial adoption of the core. Finally, the results of implementing and evaluating a currently freely available 16-bit MSP430-compatible softcore processor on an Altera Cyclone FPGA are presented. From these results, conclusions are drawn for the implementation of low-power SoC systems.
The design and realization of printed circuits and electronic components is an active field of research. Research groups are increasingly working on printed energy harvesters because they are inexpensive and easy to manufacture. Energy harvesting (EH), also called micro energy harvesting (MEH), denotes extracting electrical energy from the environment in order to supply electronic loads, generate continuous power, make the system more energy-efficient, and ensure energy storage in the microwatt range. Energy harvesting systems are an alternative to supplying self-sufficient low-power electronics from batteries. However, the energy management of such EH systems is challenging due to fluctuating energy availability and power dissipation that is not constant over time. This paper gives an overview of existing ultra-low-power energy management circuits for energy harvesters, with a particular focus on printed energy harvesters. It highlights which aspects of the presented power supply circuits should be considered when developing a power supply chip for printed energy harvesters.
MPC-Workshop Juli 2018
(2018)
A simple measuring method for acquiring the radiation pattern of an ultra-wideband Vivaldi antenna is presented. The measurement is performed by combining two identical Vivaldi antennas with some of the intrinsic properties of a stepped-frequency continuous-wave radar (SFCW radar) in the range from 1.0 GHz to 6.0 GHz. A stepper motor provided the azimuthal rotation of one of the antennas from 0° to 360°. The tests were performed in a conventional environment (laboratory/office) without an anechoic chamber, absorbing materials, or special measuring devices. The method has been tested with different pairs of Vivaldi antennas and can also be used for other antennas (with little or no change to the system), as long as their operational bandwidth lies within the frequency range of the SFCW radar.
Keywords: SFCW Radar, Antenna Gain Characterization, Azimuthal Radiation Pattern
The paper addresses the needs of universities regarding the qualification of students as future R&D specialists in efficient techniques for successfully running an innovation process. Compared with engineers, students often show lower motivation to learn systematic inventive techniques, such as the TRIZ methodology, and prefer random brainstorming for idea generation. The quality of the obtained solutions also depends on the completeness of the problem analysis, which is more complex and time-consuming in the case of interdisciplinary systems. The paper briefly describes a one-semester course of 60 hours on new product development with the Advanced Innovation Design Approach and the TRIZ methodology, in which a typical industrial innovation process is modelled for one selected interdisciplinary mechatronic product.
OPC UA (Open Platform Communications Unified Architecture) is a well-known concept used widely in the automation industry. In the area of factory automation, OPC UA models the underlying field devices, such as sensors and actuators, in an OPC UA server, allowing connected OPC UA clients to access device-specific information via a standardized information model. One requirement for the OPC UA server to represent field device data in its information model is prior knowledge about the properties of the field devices in the form of device descriptions. The international standard IEC 61804 specifies EDDL (Electronic Device Description Language) as a generic language for describing the properties of field devices. In this paper, the authors describe a possibility to dynamically map and integrate field device descriptions based on EDDL into OPC UA.
The Thread protocol is a recent development based on 6LoWPAN (IPv6 over IEEE 802.15.4), but with extensions towards a more media-independent approach which additionally promises true interoperability. To evaluate and analyse the operation of a Thread network, a given open-source 6LoWPAN stack for embedded devices (emb::6) has been extended to comply with the Thread specification. The implementation covers Mesh Link Establishment (MLE) and network layer functionality, as well as the 6LoWPAN mesh-under routing mechanism based on MAC short addresses. The development has been verified on a virtualization platform and allows the dynamic establishment of network topologies based on Thread's partitioning algorithm.
Legacy industrial communication protocols have proven robust and functional. During the last decades, the industry has invented completely new or advanced versions of the legacy communication solutions. However, even with the high adoption rate of these new solutions, the majority of industrial applications still run on legacy, mostly fieldbus-related technologies. Profibus is one of those technologies that keeps growing in the market, albeit with slowing growth in recent years. A retrofit technology is fundamental that enables these technologies to connect to the Internet of Things and to utilize the ever-growing potential of data analysis, predictive maintenance, or cloud-based applications, while at the same time not changing a running system.
High-mobility electrolyte-gated transistors (EGTs) show high DC performance at low voltages (< 2 V). To model these EGTs, we used separate models for the below- and above-threshold regimes, with appropriate interpolation to ensure continuity and smoothness across all regimes. This empirical model agrees very well with the measured results obtained from the electrical characterization of the EGTs.
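The idea of joining two regime models with a smooth interpolation can be sketched as follows. This is a generic illustration, not the authors' fitted EGT model: the regime expressions and all parameter values are invented, and a logistic blending function stands in for whatever interpolation the paper actually uses.

```python
import math

def blend(x, x0, width):
    """Logistic weight rising smoothly from 0 to 1 around x0."""
    return 1.0 / (1.0 + math.exp(-(x - x0) / width))

def drain_current(v_gs, v_th=0.6, width=0.05):
    """Toy two-regime transistor model joined by logistic interpolation:
    an exponential subthreshold branch and a quadratic above-threshold
    branch.  Because the blending weight is infinitely differentiable,
    the combined curve is continuous and smooth at the threshold."""
    i_sub = 1e-9 * math.exp((v_gs - v_th) / 0.1)    # below threshold
    i_sat = 1e-4 * max(v_gs - v_th, 0.0) ** 2       # above threshold
    w = blend(v_gs, v_th, width)
    return (1.0 - w) * i_sub + w * i_sat
```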
Today, the methods of numerical simulation of sheet metal forming offer a great diversity of possibilities for optimization in product development and process design. However, the results from simulation are only available as virtual models. Because no forming tools are available during the early stages of product development, physical models that could represent the virtual results are lacking. Physical 3D models can be created using 3D printing; they serve as an illustration and provide a better understanding of the simulation results. In this way, the results from the simulation can be made more “comprehensible” within a development team. This paper presents the possibilities of 3D colour printing with particular consideration of the requirements arising from sheet metal forming simulation. Using concrete examples of sheet metal forming, the manufacturing of 3D colour models from simulation results is expounded upon.
MPC-Workshop Februar 2016
(2016)
The latest generation of programmable logic devices contains, in addition to configurable logic cells, one or more powerful microprocessors. This paper shows how an existing two-chip system is migrated to a Xilinx Zynq 7000 with two ARM A9 cores. The system is the GPS-supported gyro system ADMA by the company GeneSys. The new solution improves the data exchange between the first microprocessor, used for digital signal processing, and the second processor, used for sequence control, by means of a shared memory. Numerous high-bitrate interfaces are used for fast, real-time-capable data transfer.
The Metering Bus, also known as M-Bus, is a European standard (EN 13757-3) for reading out metering devices such as electricity, water, gas, or heat meters. Although real-life M-Bus networks can reach significant size and complexity, only very simple protocol analyzers are available to observe and maintain such networks. In order to give developers and installers the ability to analyze the real bus signals easily, a web-based monitoring tool for the M-Bus has been designed and implemented. Combined with a physical bus interface, it allows the bus signals to be measured and recorded. To this end, a circuit was first developed that transforms the voltage- and current-modulated M-Bus signals into a voltage signal that can be read by a standard ADC and processed by an MCU. The bus signals and packets are displayed by a web server, which analyzes and classifies the frame fragments. As an additional feature, an oscilloscope functionality is included to visualize the physical signal on the bus. This paper describes the development of the read-out circuit for the wired M-Bus and the data recovery.
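The frame classification mentioned above builds on the EN 13757-3 long-frame layout (start byte 68h, two matching length fields, C/A/CI fields, user data, an arithmetic checksum, and the stop byte 16h). A minimal parser sketch, illustrative only and not the monitoring tool's actual code:

```python
def mbus_checksum(payload):
    """Arithmetic checksum of a wired M-Bus frame (EN 13757-3):
    byte-wise sum over C-field, A-field, CI-field and user data, mod 256."""
    return sum(payload) & 0xFF

def parse_long_frame(frame):
    """Minimal check-and-split of an M-Bus long frame.
    Layout: 68h L L 68h | C A CI data... | checksum 16h."""
    if frame[0] != 0x68 or frame[3] != 0x68 or frame[-1] != 0x16:
        raise ValueError("invalid frame delimiters")
    length = frame[1]
    if frame[2] != length:
        raise ValueError("length fields disagree")
    payload = frame[4:4 + length]
    if mbus_checksum(payload) != frame[4 + length]:
        raise ValueError("checksum mismatch")
    c_field, a_field, ci_field = payload[0], payload[1], payload[2]
    return c_field, a_field, ci_field, payload[3:]
```

A frame fragment that fails any of these checks would be classified as damaged rather than as a valid long frame.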
In this paper, an RFID/NFC (ISO 15693 standard) based, inductively powered passive SoC (system-on-chip) for biomedical applications is presented. A brief overview of the system design, layout techniques, and verification method is discussed. The SoC includes an integrated 32-bit microcontroller, a sensor interface circuit, an analog-to-digital converter, integrated RAM and ROM, and other peripherals required for fully passive operation. The entire chip is realized in 0.18 µm CMOS technology with a chip area of 1.52 mm × 3.24 mm.
MPC-Workshop Februar 2015
(2015)
MPC-Workshop Juli 2015
(2015)
To meet the requirements of smart grids, local decentralized subnets will offer additional potential to stabilize and relieve the utility grid, mainly at the low-voltage level. In a quite complex configuration, these decentralized energy systems combine power, heat, and cooling distribution. According to the regional and local availability of renewable energy sources, advanced energy management concepts should consider climatic conditions as well as the state of the interacting utility grid and consumption profiles. The approach uses demonstration setups to develop a forecast-based energy management for trigeneration subnets, taking into account the running conditions of local electrical and thermal energy conversion units. This should lead to the best coverage of demand while supporting and stabilizing the utility grid at the same time. For the first of three demonstration projects, the subnet's priority is to maximize CHP operation in order to substitute a major part of the heating and cooling power otherwise delivered by electric heaters or compression chillers.
The increasing number of transistors with ever smaller feature sizes leads to increasing power consumption in modern processors. This applies in particular to high-end processors operated at high clock frequencies. The consumed power is converted into heat, which raises the processor temperature. High operating temperatures cause, among other things, reduced computing performance, a shorter processor lifetime, and higher leakage currents. For these reasons, active dynamic thermal management is becoming increasingly important. This paper presents an extension to the standard Linux scheduler in kernel version 3.0 for embedded systems: a PID controller that performs dynamic frequency and voltage scaling for a given target temperature. Experiments on the Freescale i.MX6 quad-core processor show that the PID controller can regulate the processor's operating temperature to the target temperature. It forms the basis for a predictive controller to be developed in the future.
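The control loop described above, a PID controller that picks a clock frequency so that the die temperature settles at a target value, can be sketched as follows. This is an illustrative simulation against a toy first-order thermal model, not the kernel patch itself; all gains and thermal parameters are invented.

```python
class PID:
    """Textbook discrete PID controller."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def run_dvfs(setpoint=70.0, ambient=40.0, steps=5000, dt=0.1):
    """Close the loop against a toy thermal model in which die temperature
    follows dissipated power with a first-order lag, and power grows with
    clock frequency (it also grows quadratically with voltage, omitted here)."""
    pid = PID(kp=0.01, ki=0.005, kd=0.0, dt=dt)
    temp = ambient
    for _ in range(steps):
        # frequency in GHz, clamped to the allowed DVFS range
        freq = min(1.2, max(0.4, 0.8 + pid.update(setpoint, temp)))
        power = 8.0 * freq
        temp += dt * (ambient + 5.0 * power - temp) / 5.0
    return temp
```

Thanks to the integral term, the simulated temperature converges to the setpoint without steady-state error; the clamp models the discrete operating-point limits of real DVFS.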
MPC-Workshop Februar 2014
(2014)
MPC-Workshop Juli 2014
(2014)
Android is an operating system which was developed for use in smart mobile phones and is the current leader in this market. Much effort is also being spent on making Android available to the embedded world. Many embedded systems do not have a local GUI and are therefore called headless devices. This paper presents the results of an analysis of the general suitability of Android for headless embedded systems and weighs the advantages and disadvantages. It focuses on hardware-related issues, i.e. to what extent Android supports the hardware peripherals normally used in embedded systems.
MPC-Workshop Februar 2013
(2013)
MPC-Workshop Juli 2013
(2013)
Machine-to-machine communication is continuously extending to new application fields. Smart metering in particular has the potential to become the first truly large-scale M2M application. Although distributed meter devices will in the future be mainly connected via dedicated primary communication protocols, like ZigBee, Wireless M-Bus, or alike, a major percentage of all meters will be connected via point-to-point communication using GPRS or UMTS platforms. Such meter devices therefore have to be extremely cost- and energy-efficient, especially if they are battery-based and powered for several years by a single battery. This paper presents the development of an automated measurement unit for power and time, so that energy characteristics can be recorded. The measurement unit includes a hardware platform for the device under test (DUT) and a database-based software environment for smooth execution and analysis of the measurements.
The research project Ko-TAG [2], part of the research initiative Ko-FAS [1] funded by the German Ministry of Economics and Technology (BMWi), deals with the development of a wireless cooperative sensor system that shall provide a benefit to current driver assistance systems (DAS) and traffic safety applications (TSA). The system's primary function is the localization of vulnerable road users (VRU), e.g. pedestrians and powered two-wheelers, using communication signals, but it can also serve as a pre-crash (surround) safety system among vehicles. The main difference of this project, compared to previous ones that dealt with this topic, e.g. the AMULETT project, is an underlying FPGA-based hardware-software co-design. The platform drives a real-time capable communication protocol that enables highly scalable network topologies fulfilling the hard real-time requirements of the individual localization processes. Additionally, it allows the exchange of further data (e.g. sensor data) to support the accident prediction process and the channel arbitration, and thus supports true cooperative sensing. This paper gives an overview of the project's current system design as well as of the implementations of the key HDL entities supporting the software parts of the communication protocol. Furthermore, an approach for the dynamic reconfiguration of the devices is described, which provides several topology setups using a single PCB design.
MPC-Workshop Februar 2012
(2012)
MPC-Workshop Juli 2012
(2012)
The efficient support of Hardware-in-the-Loop (HIL) in the design process of hardware-software co-designed systems is an ongoing challenge. This paper presents a network-based integration of hardware elements into the software-based image processing tool „ADTF“, based on a high-performance Gigabit Ethernet MAC and a highly efficient TCP/IP stack. The MAC has been designed in VHDL. It was verified in a SystemC simulation environment and tested on several Altera FPGAs.
School buildings in the property portfolios of many municipalities have attracted increasing public interest in recent years. Many of the buildings date from the 1970s or 1980s and are due for modernization as part of their upkeep. The high operating costs for heating in particular had so far put measures for winter thermal protection in the foreground. The increasingly frequent extreme summer temperatures on school days in recent years also reveal a need for action in the area of summer thermal protection. For facility management tasks and for implementing energy-efficient building operation, the advantages of a versatile building automation system that is accessible via central FM offices (e.g. a technical town hall) are becoming ever more apparent.
MPC-Workshop Februar 2011
(2011)
MPC-Workshop Juli 2011
(2011)
With the transition to ever more complex designs at Offenburg University, DFT structures such as boundary scan and scan are becoming necessary in ASIC designs. In the future, the DFT structure scan will be applied by implementing a dedicated scan chain in the core logic of the ASIC design, which is then integrated into the boundary scan architecture.
The structures are first implemented in the fairly simple ASIC design "Rolling Dice", developed at the IAF of Offenburg University. After the functionality of the structures has been verified by emulation, they are introduced into more complex ASIC designs such as the front-end ASIC DQPSK and the processor ASIC PDA V.2 (both likewise developed at the IAF of Offenburg University).
The more complex ASIC designs equipped with DFT structures are not verified within this work; the focus is mainly on introducing the DFT structures into the "Rolling Dice" ASIC design.
A comparison of the effort versus the benefit of implementing DFT structures in "small" versus "large" ASIC designs forms an important conclusion.
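The scan principle behind such DFT structures can be modeled very simply: in test mode the flip-flops form one long shift register, so a stimulus is shifted in serially, one capture cycle latches the core logic's response, and the response is shifted back out. A toy behavioral sketch (not the actual scan insertion flow):

```python
def scan_shift(chain, serial_in):
    # Shift one bit into the scan chain; the bit at the end falls out
    # on the serial output.
    serial_out = chain[-1]
    chain[:] = [serial_in] + chain[:-1]
    return serial_out

def scan_load(chain, pattern):
    # Shift a full test pattern into the chain, collecting the previous
    # chain contents (the response of the last capture) on the way out.
    return [scan_shift(chain, bit) for bit in pattern]

def capture(chain, logic):
    # One capture cycle in functional mode: the combinational core logic
    # computes the next flip-flop values from the current chain state.
    chain[:] = logic(chain)
```

The shift-capture-shift sequence is exactly what gives external test equipment controllability and observability of internal flip-flops.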
Mobile learning (m-learning) can be considered as a new paradigm of e-learning. The developed solution enables the presentation of animations and 3D virtual reality (VR) on mobile devices and is well suited for mobile learning. Difficult relations in physics as well as intricate experiments in optics can be visualised on mobile devices without need for a personal computer. By outsourcing the computational power to a server, the coverage is worldwide.
Tagungsband zum Workshop der Multiprojekt-Chip-Gruppe Baden-Württemberg, Göppingen, 5. Februar 2010
(2010)
Tagungsband zum Workshop der Multiprojekt-Chip-Gruppe Baden-Württemberg, Reutlingen, 9. Juli 2010
(2010)
The cache for the SIRIUS soft processor is a 4-way set-associative cache that accesses an external memory via a DDR interface. It manages and accelerates processor accesses to this memory. Internally, the cache operates on 32 bits at twice the processor clock frequency and enables systems with larger memory requirements without significant performance loss. The cache was written in the hardware description language VHDL and connected to the existing microcontroller system.
The complete system was first simulated and then successfully verified by running a test program on Altera's Cyclone III FPGA Starter Kit, which provides a 32 MB DDR RAM module. Including the pins for the external oscillator and the reset button, the complete cache requires 3805 logic cells, 27 M9K blocks, 44 pins, and one PLL.
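The lookup behavior of a 4-way set-associative cache can be sketched in a few lines: the address is split into a set index and a tag, each set holds up to four tags, and on a conflict the least-recently-used way is evicted. The set count and replacement policy below are illustrative assumptions, not the SIRIUS cache's actual parameters:

```python
class SetAssociativeCache:
    """Toy model of a 4-way set-associative cache with LRU replacement."""

    def __init__(self, ways=4, num_sets=8):
        self.ways = ways
        self.num_sets = num_sets
        # Each set holds up to `ways` tags, ordered most-recently-used first.
        self.sets = [[] for _ in range(num_sets)]
        self.hits = 0
        self.misses = 0

    def access(self, address):
        index = address % self.num_sets   # low address bits select the set
        tag = address // self.num_sets    # remaining bits form the tag
        tags = self.sets[index]
        if tag in tags:
            self.hits += 1
            tags.remove(tag)
        else:
            self.misses += 1
            if len(tags) == self.ways:
                tags.pop()                # evict the least-recently-used way
        tags.insert(0, tag)               # mark this tag most-recently-used
        return index, tag
```

Four ways mean that up to four addresses mapping to the same set can coexist before an eviction occurs, which is what keeps conflict misses low for typical code and data access patterns.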
At the ASIC Design Center of Offenburg University, a design kit for the UMC 0.18 μm Faraday technology is being prepared. All files required for an initially purely digital chip design using the Synopsys, Cadence, and Mentor tools are being compiled for the UMC 0.18 μm process.
A multitude of PDAs exists on the market. All offer a very wide range of functions, outdo each other from generation to generation, and require a high development effort from entire developer teams.
The PDA developed in this work, with its hardware and software, is not meant to be a competing product but to show what is possible with the in-house resources of Offenburg University, and potentially to provide a user interface for existing or future projects.
The resulting device is autonomous in battery operation and can be run as a stand-alone system. Its centerpiece is the SIRIUS soft-core microprocessor system, which is emulated as a VHDL model in an FPGA.
An AMOLED display is used to render the graphical operating system developed specifically for this PDA. The display has a touch panel, which is used to control the system. On the software side, basic functions for displaying images and text were created, along with example applications that use them. The graphical operating system is modular and allows applications for the system to be developed directly.
Proceedings of the workshop of the Multiprojekt-Chip-Gruppe Baden-Württemberg, Künzelsau, 6 February 2009
(2009)
Proceedings of the workshop of the Multiprojekt-Chip-Gruppe Baden-Württemberg, Karlsruhe, 10 July 2009
(2009)
The main component of the operating system is access to SD cards with Microsoft's FAT16 file system. A command-line interpreter was implemented for operation. A PC running a dedicated terminal program, connected via USB to the emulation board of the SIRIUS soft core, serves as the input and output device. The system is controlled by entering commands at the terminal.
The SIRIUS soft core can only boot from the flash memory of the emulation board. Since the operating system itself is to be stored on the SD card, a base operating system residing in flash is required. Immediately after start-up, the base operating system loads the actual operating system from the SD card. If no SD card is inserted, the base operating system provides some basic functions through a command-line interpreter.
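The two-stage boot with fallback described above can be sketched as a simple dispatch: if an SD card is present, chain-load the full command set from the card, otherwise stay in the base OS's minimal interpreter. The command names and handlers below are hypothetical, since the abstract does not list the actual command set:

```python
def make_interpreter(commands):
    # Build a command-line interpreter over a table of command handlers.
    def interpret(line):
        name, *args = line.split()
        handler = commands.get(name)
        if handler is None:
            return f"unknown command: {name}"
        return handler(*args)
    return interpret

def boot(sd_present, full_os_commands, base_commands):
    # The base OS in flash chain-loads the full OS from the SD card when a
    # card is present; otherwise it falls back to its own minimal commands.
    return make_interpreter(full_os_commands if sd_present else base_commands)
```

The point of the split is robustness: the system always comes up into a usable shell, even with no card inserted.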
RFID Frontend ISO 15693
(2008)
As part of a master's thesis, an existing system-on-chip design that processes incoming ECG data signals was extended so that it can be fully controlled and read out via the standardized SPI bus.
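A common way to make a SoC controllable over SPI is a register file addressed by the first byte of each transaction. The sketch below models that pattern; the command-byte layout (read/write flag in the MSB, 7-bit register address) and the register map are assumptions for illustration, not taken from the thesis:

```python
class SpiRegisterFile:
    """Toy model of controlling a SoC through SPI: each transaction is a
    command byte (R/W flag plus register address) followed by one data byte."""

    WRITE = 0x80  # MSB of the command byte selects write vs. read

    def __init__(self):
        self.regs = [0] * 128  # one byte-wide register per 7-bit address

    def transfer(self, command, data=0):
        addr = command & 0x7F
        if command & self.WRITE:
            self.regs[addr] = data & 0xFF   # write transaction
            return 0
        return self.regs[addr]              # read transaction
```

With such a scheme, every internal control and status signal of the design becomes reachable through a single 4-wire interface.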
Proceedings of the workshop of the Multiprojekt-Chip-Gruppe Baden-Württemberg, Konstanz, 4 July 2008
(2008)
Electronic pills, smart capsules, and miniaturized microsystems swallowed by humans or animals for various biomedical and diagnostic applications have grown rapidly in recent years. This paper surveys the important electronic pills on the market and prototypes in research centers. A further objective of this research is to develop a technology platform with enhanced features that overcomes the drawbacks of most capsules. The designed telemetry unit is a synchronous bidirectional communication block using continuous-phase DQPSK at a low carrier frequency of 115 kHz for inductive data transmission, suited to energy transfer through the human body. The communication system can assist the electronic pill in triggering an actuator for drug delivery, recording temperature, or measuring the pH of the body. In addition, the platform comprises a 32-bit processor, memory, external peripherals, and detection facilities. The complete system is designed to fit small, low-power, mass-market medical applications, with a size of 7 × 25 mm. The system was designed, simulated, and emulated on an FPGA.
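The DQPSK scheme mentioned above encodes each bit pair as a phase *change* relative to the previous symbol, so the receiver needs no absolute phase reference. A baseband sketch using one common Gray-coded mapping (the pill's actual dibit mapping is not given in the abstract):

```python
import cmath
import math

# Gray-coded dibit -> phase increment mapping (one common DQPSK convention).
PHASE_STEP = {(0, 0): 0.0, (0, 1): math.pi / 2,
              (1, 1): math.pi, (1, 0): 3 * math.pi / 2}

def dqpsk_modulate(bits, phase=0.0):
    # Each pair of bits advances the carrier phase by its mapped increment.
    symbols = []
    for i in range(0, len(bits), 2):
        phase = (phase + PHASE_STEP[(bits[i], bits[i + 1])]) % (2 * math.pi)
        symbols.append(cmath.exp(1j * phase))
    return symbols

def dqpsk_demodulate(symbols, phase=0.0):
    # Recover bits from the phase difference between successive symbols.
    inverse = {round(v / (math.pi / 2)) % 4: k for k, v in PHASE_STEP.items()}
    bits = []
    prev = cmath.exp(1j * phase)
    for sym in symbols:
        delta = cmath.phase(sym / prev) % (2 * math.pi)
        bits.extend(inverse[round(delta / (math.pi / 2)) % 4])
        prev = sym
    return bits
```

Because only phase differences carry information, the scheme tolerates the slowly drifting absolute phase of an inductive link, which is why it suits this kind of low-frequency through-body channel.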
MPC-Workshop February 2007
(2007)
MPC-Workshop July 2007
(2007)