In recent years, embedded machine learning has become a popular topic in AI research. Compression techniques such as pruning and quantization have made it possible to run neural networks on embedded devices. These techniques have opened up a whole new application area for machine learning, ranging from smart products such as voice assistants to smart sensors needed in robotics. Despite these achievements in embedded machine learning, efficient algorithms for training neural networks in constrained environments are still lacking. Training on embedded devices would open up further fields of application: efficient training algorithms would enable federated learning on embedded devices, in which the data remains where it was collected, or the retraining of neural networks in different domains. In this paper, we summarize techniques that make training on embedded devices possible. We first describe the need and requirements for such algorithms. Then we examine existing techniques that address training in resource-constrained environments, as well as techniques that are also suitable for training on embedded devices, such as incremental learning. Finally, we discuss which problems and open questions still need to be solved in these areas.
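The compression step can be illustrated with a minimal sketch of symmetric post-training quantization, one of the techniques named above. The function names and weight values are purely illustrative; real frameworks use per-channel scales and calibration data.

```python
def quantize_int8(weights):
    """Symmetric post-training quantization: map floats to signed 8-bit ints."""
    scale = max(abs(w) for w in weights) / 127.0  # largest magnitude maps to 127
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 representation."""
    return [v * scale for v in q]

weights = [0.52, -1.27, 0.03, 0.98]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
# The round-trip error is bounded by half a quantization step.
assert all(abs(w - r) <= scale / 2 + 1e-9 for w, r in zip(weights, restored))
```

The same idea, applied layer by layer, is what shrinks networks enough to fit embedded memory budgets.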
Nowadays, the decarbonisation of the energy system is one of the main concerns for most governments. Renewable energy technologies, such as rooftop photovoltaic systems and home battery storage systems, are making the energy system more decentralised. As a consequence, new energy business models are emerging, e.g., peer-to-peer energy trading. This new concept provides an online marketplace where direct energy exchange can occur between its participants. The purpose of this study is to conduct a content analysis of the existing literature, ongoing research projects, and companies related to peer-to-peer energy trading. From this review, a summary of the most important aspects and journal papers is assessed, discussed, and classified. It was found that the different energy market types were named in various ways, and a standard terminology for the several peer-to-peer market types and the different actors involved is proposed. Additionally, by grouping the most important attributes of peer-to-peer energy trading projects, an assessment of the entry barriers and scalability potential is performed using a characterisation matrix.
Featherweight Generic Go (FGG) is a minimal core calculus modeling the essential features of the programming language Go. It includes support for overloaded methods, interface types, structural subtyping and generics. The most straightforward semantic description of the dynamic behavior of FGG programs is to resolve method calls based on runtime type information of the receiver.
This article shows a different approach by defining a type-directed translation from FGG to an untyped lambda-calculus. The translation of an FGG program provides evidence for the availability of methods as additional dictionary parameters, similar to the dictionary-passing approach known from Haskell type classes. Then, method calls can be resolved by a simple lookup of the method definition in the dictionary.
Every program in the image of the translation has the same dynamic semantics as its source FGG program. The proof of this result is based on a syntactic, step-indexed logical relation. The step-index ensures a well-founded definition of the relation in the presence of recursive interface types and recursive methods.
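The dictionary-passing idea is not specific to FGG. The following Python sketch (all names hypothetical) mimics the shape of the translation: evidence that a type implements an interface is passed as an explicit dictionary, and a method call becomes a plain lookup rather than dispatch on runtime type information.

```python
# A "dictionary" maps method names to implementations. The translation
# builds one per (type, interface) pair at compile time; at runtime no
# type information is inspected.

def make_point_dict():
    # Evidence that type Point implements a Stringer-like interface.
    return {"String": lambda self: f"({self['x']}, {self['y']})"}

def describe(value, stringer_dict):
    # A function over the interface receives the dictionary as an extra
    # parameter; the method call is a plain lookup in that dictionary.
    return "value is " + stringer_dict["String"](value)

point = {"x": 1, "y": 2}
print(describe(point, make_point_dict()))  # value is (1, 2)
```

The translated program carries these dictionaries wherever the source program used interface values, which is exactly what the step-indexed logical relation proves behaviour-preserving.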
Recent advances in spiked shoe design, characterized by increased longitudinal stiffness, thicker midsole foams, and reconfigured geometry, are considered to improve sprint performance. However, no empirical data on the effects of advanced spike technology on maximal sprinting speed (MSS) have been published yet. Consequently, we assessed MSS via ‘flying 30 m’ sprints of 44 trained male (PR: 10.32 s - 12.08 s) and female (PR: 11.56 s - 14.18 s) athletes, wearing both traditional and advanced spikes in a randomized, repeated-measures design. The results revealed a statistically significant increase in MSS of 1.21% on average when using advanced spike technology. Notably, 87% of participants showed improved MSS with the use of advanced spikes. A cluster analysis revealed that athletes with higher MSS may benefit to a greater extent. However, individual responses varied widely, suggesting the influence of multiple factors that need detailed exploration. Therefore, coaches and athletes are advised to interpret the promising performance enhancements cautiously and to evaluate critically whether the advanced spike technology is appropriate for their athletes.
Alexander von Humboldt, a German scientist and explorer of the 19th century, viewed the natural world holistically and described the harmony of nature among the diversity of the physical world as a conjoining between all physical disciplines. He noted in his diary: “Everything is interconnectedness.”
The main feature of Humboldt’s pioneering work was later named “Humboldtian science”, meaning the accurate study of interconnected real phenomena in order to find a definite law and a dynamic cause.
Following Humboldt's idea of nature, an Internet edition of his works must preserve the author’s original intention, retain an awareness of all relevant works, and still adhere to the requirements of a scholarly edition.
At present, however, the highly unconventional form of his publications has hindered awareness and comprehensive study of Humboldt’s works.
Digital libraries should supply dynamic links to sources, maps, images, graphs and relevant texts. New forms of interaction and synthesis between humanistic texts and scientific observation need to be created.
Information technology is the only way to do justice to the broad range of visions, descriptions and the idea of nature of Humboldt’s legacy. It finally leads to virtual research environments as an adequate concept to redesign our digital archives, not only for Humboldt’s documents, but for all interconnected data.
Due to its performance, the field of deep learning has gained a lot of attention, with neural networks succeeding in areas like Computer Vision (CV), Natural Language Processing (NLP), and Reinforcement Learning (RL). However, high accuracy comes at a computational cost, as larger networks require longer training times and no longer fit onto a single GPU. To reduce training costs, researchers are looking into the dynamics of different optimizers in order to find ways to make training more efficient. Resource requirements can be limited by reducing model size during training or by designing more efficient models that improve accuracy without increasing network size.
This thesis combines eigenvalue computation and high-dimensional loss surface visualization to study different optimizers and deep neural network models. Eigenvectors of different eigenvalues are computed, and the loss landscape and optimizer trajectory are projected onto the plane spanned by those eigenvectors. A new parallelization method for the stochastic Lanczos method is introduced, resulting in faster computation and thus enabling high-resolution videos of the trajectory and second-order information during neural network training. Additionally, the thesis presents, for the first time, the loss landscape between two minima along with the eigenvalue density spectrum at intermediate points.
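The eigenvalue machinery can be illustrated in miniature. The sketch below uses plain power iteration on a toy symmetric matrix as a stand-in for the stochastic Lanczos method, which generalizes the same matrix-vector-product idea to recover a whole eigenvalue spectrum; in practice the Hessian is never formed explicitly and Hessian-vector products are used instead.

```python
def matvec(A, v):
    """Multiply matrix A (list of rows) by vector v."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def power_iteration(A, iters=200):
    """Estimate the dominant eigenpair of a symmetric matrix.
    Lanczos refines this matrix-vector-product idea to many eigenvalues."""
    v = [1.0] * len(A)
    for _ in range(iters):
        w = matvec(A, v)
        n = sum(x * x for x in w) ** 0.5
        v = [x / n for x in w]          # renormalize each step
    # Rayleigh quotient gives the eigenvalue estimate.
    lam = sum(x * y for x, y in zip(v, matvec(A, v)))
    return lam, v

H = [[2.0, 1.0], [1.0, 2.0]]  # toy "Hessian"; eigenvalues are 1 and 3
lam, v = power_iteration(H)
print(lam)  # close to 3.0
```

Projecting the loss along the returned eigenvector direction is the one-dimensional analogue of the plane projections used in the visualizations.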
Secondly, this thesis presents a regularization method for Generative Adversarial Networks (GANs) that uses second-order information. The gradient during training is modified by subtracting the eigenvector direction of the biggest eigenvalue, preventing the network from falling into the steepest minima and avoiding mode collapse. The thesis also shows the full eigenvalue density spectra of GANs during training.
Thirdly, this thesis introduces ProxSGD, a proximal algorithm for neural network training that guarantees convergence to a stationary point and unifies multiple popular optimizers. Proximal gradients are used to find a closed-form solution to the problem of training neural networks with smooth and non-smooth regularizations, resulting in better sparsity and more efficient optimization. Experiments show that ProxSGD can find sparser networks while reaching the same accuracy as popular optimizers.
Lastly, this thesis unifies sparsity and neural architecture search (NAS) through the framework of group sparsity. Group sparsity is achieved through ℓ2,1-regularization during training, allowing filter and operation pruning to reduce model size with minimal sacrifice in accuracy. By grouping multiple operations together, group sparsity can be used for NAS as well. This approach is shown to be more robust while still achieving competitive accuracies compared to state-of-the-art methods.
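Both ProxSGD and the ℓ2,1 group-sparsity framework rest on a closed-form proximal operator. The sketch below shows the proximal step of the ℓ2,1 norm (group soft-thresholding) on toy parameter groups; the threshold value and group contents are illustrative only.

```python
def group_soft_threshold(groups, lam):
    """Proximal operator of the l2,1 norm: shrink each parameter group
    toward zero and drop it entirely if its l2 norm falls below lam."""
    result = []
    for g in groups:
        n = sum(x * x for x in g) ** 0.5
        factor = max(0.0, 1.0 - lam / n) if n > 0 else 0.0
        result.append([factor * x for x in g])
    return result

# Two filter groups: one strong, one weak; lam prunes the weak group.
groups = [[3.0, 4.0], [0.1, 0.2]]
pruned = group_soft_threshold(groups, lam=0.5)
print(pruned)  # the second group collapses to zeros and can be pruned
```

Applied after each gradient step, this is what zeroes out entire filters (or candidate operations, in the NAS setting) during training.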
Go is a programming language with a static type system, released in 2009. Since version 1.18, generics are also part of the language. In the de facto standard compiler, their translation is implemented via monomorphization, which brings some disadvantages along with its advantages. For this reason, this thesis explores an alternative translation strategy for generics in Go and implements it in a new compiler for Featherweight Generic Go, a subset of Go. The result is an almost fully working compiler, which ultimately emits Racket code. An evaluation of the performance of the translation strategy is still pending.
This paper aims to create a framework for a different cultural dimension of corporate entrepreneurship, leading to a corporate entrepreneurial culture (CEC). The analysis of CEC is based on a review of existing concepts of organisational culture and entrepreneurship, which are combined to create a framework of CEC, including macro- and micro-levels and examples of subcultures. Core ideas of the framework are validated by qualitative interviews with ten experts. The organisational category of the CEC framework is defined by the levels of micro-cultures or subcultures and includes the upper levels of the hierarchy, up to the industry level. Geographic categories such as regional or national culture are also part of the system. The individual category of the CEC framework is characterised by competencies (including aspects such as motivation, creativity, mobilising others, coping with uncertainty, teamwork, and social competencies) and entrepreneurial personalities. The results of the interviews show the importance of these individual competencies for a lively CEC. The different levels, such as national and professional cultures, as a dimension of the organisational category of the framework are also confirmed by the interviews. The findings indicate that the individual category of CEC could be used for job satisfaction or engagement, and that the degree of CEC of an organisation could be defined and developed through the organisational category. The identified framework contributes to an understanding of this complex topic and supports companies in the implementation of entrepreneurial ideas in different organisational contexts.
Total Cost of Ownership (TCO) is a key tool for gaining a complete understanding of the costs associated with an investment, as it covers not only the initial acquisition costs but also the long-term costs related to operation, maintenance, depreciation, and other factors. In the context of the cement industry, TCO is especially important due to the complexity of the production processes and the wide variety of components and machinery involved.
For this reason, a TCO analysis for the cement industry has been conducted in this study, with the objective of showing the different components of the cost of production. This analysis allows the reader to gain knowledge about these costs and enables industrial decision-makers to make informed choices on the adoption of technologies and practices that reduce costs in the long run and improve operational efficiency.
In particular, this study seeks to give visibility to technologies and practices that enable the reduction of carbon emissions in cement production, thus contributing to the sustainability of the industry and the protection of the environment. By being at the forefront of sustainability issues, the cement industry can contribute to the development of environmentally friendly technologies and enable the development of people and industry.
Oxyfuel technology has been selected as the carbon capture solution for the cement industry due to its practical applicability, low costs, and straightforward adaptation of non-capture processes. The adoption of this technology allows for a significant reduction in CO2 emissions, which is a crucial factor in achieving sustainability in the cement manufacturing process.
Carbon capture and storage technologies require a high investment. Although they increase the cost of production, Oxyfuel technology is among the most economically viable options, being the cheapest technology per unit of CO2 captured in the comparison. This price increase is offset by a technical advantage: the carbon capture efficiency of the technology reaches 90%. This level of efficiency reduces the taxes levied on CO2 emissions, making the cement manufacturing process more sustainable.
Impact of a Halt to Russian Energy Imports on Climate Protection Targets in Germany
(2022)
A halt to imports of Russian energy carriers to Germany is currently the subject of increasing debate. We want to support this discussion by showing a way in which the electricity system in Germany can manage with low energy imports in the short term, and which measures are necessary to still meet the climate protection targets. The results of such an energy transition scenario with reduced import dependency are computed with the energy system model MyPyPSA-Ger. The key findings are that a rapid expansion of renewable energies and storage technologies • significantly reduces the dependency of the German electricity system on energy imports, • does not entail substantial imports of natural gas, hard coal, or mineral oil even in the long term, and • achieves, beyond the Federal Government's climate targets, the 1.5-degree target in the electricity system.
Automatic Identification of Travel Locations in Rare Books - Object Oriented Information Management
(2017)
The digital content of the Internet is growing exponentially, and the mass digitization of printed media opens access to literature, in particular the genre of travel literature from the 18th and 19th centuries, which consists of diaries or travel books describing routes, observations, or inspirations. The identification of the locations described in the digital text is a long-standing challenge which requires information technology to supply dynamic links to sources through new forms of interaction and synthesis between humanistic texts and scientific observations.
Using object oriented information technology, a prototype of a software tool is developed which makes it possible to automatically identify geographic locations and travel routes mentioned in rare books. The information objects contain properties such as names and classification codes for populated places, streams, mountains, and regions. Together with the latitudes and longitudes of every single location, it is possible to geo-reference this information so that all processed and filtered datasets can be displayed by a map application. This method has already been used in the Humboldt Digital Library to present Alexander von Humboldt’s maps and was tested in a case study to prove the correctness and reliability of the automatic identification of locations, based on the work of Alexander von Humboldt and Johann Wolfgang von Goethe.
The results reveal numerous errors due to misspellings, changes of location names, and the equality of common terms and location names. On the other hand, it becomes very clear that the results of automatic object detection and recognition can be improved by error-free and comprehensive sources. As a result, an increase in the quality and usability of the service can be expected, accompanied by more options to detect unknown locations in the descriptions of rare books.
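A small building block of such geo-referencing is computing distances between identified places from their latitudes and longitudes. The sketch below uses the haversine great-circle formula; the coordinates are rough illustrative values, not taken from the study.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two geo-referenced points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # 6371 km: mean Earth radius

# Approximate distance between Berlin and Weimar.
d = haversine_km(52.52, 13.405, 50.98, 11.33)
print(f"{d:.0f} km")
```

Distances like this allow plausibility checks of detected locations, e.g. rejecting a candidate place that lies implausibly far from the rest of a travel route.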
We have developed a methodology for the systematic generation of a large image dataset of macerated wood references, which we used to generate image data for nine hardwood genera. This is the basis for a substantial approach to automate, for the first time, the identification of hardwood species in microscopic images of fibrous materials by deep learning. Our methodology includes a flexible pipeline for easy annotation of vessel elements. We compare the performance of different neural network architectures and hyperparameters. Our proposed method performs similarly well to human experts. In the future, this will improve controls on global wood fiber product flows to protect forests.
In this paper, we study the runtime performance of symmetric cryptographic algorithms on an embedded ARM Cortex-M4 platform. Symmetric cryptographic algorithms can serve to protect the integrity and optionally, if supported by the algorithm, the confidentiality of data. A broad range of well-established algorithms exists, where the different algorithms typically have different properties and come with different computational complexity. On deeply embedded systems, the overhead imposed by cryptographic operations may be significant. We execute the algorithms AES-GCM, ChaCha20-Poly1305, HMAC-SHA256, KMAC, and SipHash on an STM32 embedded microcontroller and benchmark the execution times of the algorithms as a function of the input lengths.
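As a rough illustration of such a benchmark, one of the listed algorithms (HMAC-SHA256) can be timed as a function of input length with the Python standard library. Absolute numbers on a desktop host differ greatly from a Cortex-M4, so only the scaling behavior with message length carries over.

```python
import hashlib, hmac, os, time

def bench_hmac_sha256(msg_len, iterations=2000):
    """Average time per HMAC-SHA256 call over a random message of msg_len bytes."""
    key, msg = os.urandom(32), os.urandom(msg_len)
    start = time.perf_counter()
    for _ in range(iterations):
        hmac.new(key, msg, hashlib.sha256).digest()
    return (time.perf_counter() - start) / iterations

# Runtime grows with input length, as in the embedded measurements.
for n in (16, 256, 4096):
    print(f"{n:5d} bytes: {bench_hmac_sha256(n) * 1e6:.2f} us per call")
```

On the microcontroller, the same measurement loop would use a cycle counter (e.g. the Cortex-M DWT) instead of a wall-clock timer.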
The variable refrigerant flow (VRF) system is one of the best heating, ventilation, and air conditioning (HVAC) systems thanks to its ability to provide thermal comfort inside buildings. At the same time, these systems are considered among the most energy-consuming systems in the building sector. Thus, it is crucial to size the system correctly according to the building’s cooling and heating needs and the indoor temperature fluctuations. Although many researchers have studied the optimization of building energy performance considering heating or cooling needs, using air handling units, radiant floor heating, and direct expansion valves, few studies have considered multi-objective optimization using only the thermostat setpoints of VRF systems for both cooling and heating needs. Thus, the main aim of this study is to conduct a sensitivity analysis and a multi-objective optimization strategy for a residential building containing a variable refrigerant flow system, to evaluate the effect of the building performance on energy consumption and improve the building energy efficiency. The numerical model was based on the EnergyPlus, jEPlus, and jEPlus+EA simulation engines. The approach used in this paper has allowed us to reach significant quantitative energy savings by varying the cooling and heating setpoints and scheduling scenarios. It should be stressed that this approach could be applied to several HVAC systems to reduce building energy consumption.
The topic of this master's thesis is "Camera Stream Solution – Market Overview, Approaches, Prototype". The thesis implements a video streaming solution for the Herrenknecht platform CONNECTED. It covers the screen capture of navigation and control screens on tunnel boring machines and the transfer of these recordings to the cloud. Ultimately, it becomes possible to play back the recordings in near real time as a video stream in a video player.
First, the fundamentals of data transfer on the Internet and of streaming are explained. Subsequently, a market overview of various streaming components is given, and several approaches are presented and compared against selected criteria. The next step covers the implementation of a prototype. Among other things, it uses ffmpeg for screen capture and encoding, as well as the streaming protocols RTMP (Real Time Messaging Protocol) and HLS (HTTP Live Streaming). The realization of the architecture also includes the development of a REST API and a REST client in C#.
The project develops a "true" streaming solution for the customer platform CONNECTED that provides a video stream at 24 frames per second, replacing the previous display of screenshots on the platform.
CNN-based deep learning models for disease detection have become popular recently. We compared the binary classification performance of eight prominent deep learning models: DenseNet121, DenseNet169, DenseNet201, EfficientNet-b0, EfficientNet-lite4, GoogleNet, MobileNet, and ResNet18 on a combined pulmonary chest X-ray dataset. Despite their widespread application to medical images in different fields, there remains a knowledge gap in determining their relative performance when applied to the same dataset, a gap this study aimed to address. The dataset combined data from Shenzhen, China (CH) and Montgomery, USA (MC). We trained each model for binary classification, calculated different metrics for the mentioned models, and compared them. All models were trained with the same training parameters to maintain a controlled comparison environment. At the end of the study, we found distinct differences in performance among the models when applied to the pulmonary chest X-ray image dataset, where DenseNet169 performed with 89.38 percent and MobileNet with 92.2 percent precision.
The robust scheduling problem is a major decision problem addressed in the literature, especially for remanufacturing systems; it is complex because of the high uncertainty and complex constraints involved. Generally, the existing approaches are dedicated to specific processes and do not enable the quick and efficient generation and evaluation of schedules. With the emergence of the Industry 4.0 paradigm, data availability is now considered an opportunity to facilitate the decision-making process. In this study, a data-driven decision-making process is proposed to treat the robust scheduling problem of remanufacturing systems in uncertain environments. In particular, this process generates simulation models based on a data-driven modeling approach. A robustness evaluation approach is proposed to answer several decision questions. An application of the decision process in an industrial case of a remanufacturing system is presented herein, illustrating the impact of robustness evaluation results on real-life decisions.
DE\GLOBALIZE
(2022)
The artistic research cycle DE\GLOBALIZE is a media ecological search movement for the terrestrial. After examining matters of fact in India (2014-18), matters of concern in Egypt (2016-2019) and matters of care in the Upper Rhine (2018-22), the focus turns toward matters of violence in the Congo (2022). From matter to mater, mother-earth, the garden to exploitation. From science, water and climate to migration, oppression and extermination.
The long-term research is accessible through interactive web documentation. The platform serves as a continuous media-archaeological archive for a speculative ethnography. The relational structure of the videographic essay is enabling the forensic processing of single documents in the sense of the actor-network theory.
The subject of the presentation at IFM is a field trip to the Congo planned for March 2022, which will focus on the ambivalence of violence and care in collaboration with local artists. The field trip is based on the postcolonial reflection luderitzcargo by the author from 1996, in which a freight container was transformed into a translocal cinema in Namibia.
Through the journey to the Congo, a group of media artists, a psychotherapist, a theater dramaturg, a filmmaker, and a philosopher intend to explore political, technological, and psycho-geographic borders. Through artistic interventions with locals, we want to interfere with relational string figures as part of the new Earth Politics. We focus on the displaced consumption of resources which are hard-fought over and guarantee prosperity in the global north. The so-called ghost acreages are repressed and justified as part of a civilizational mission. With this trip, we want to confront our self-lies with those of our hosts. We want to confront ourselves with the foreign, the dark, and the displaced ghosts within ourselves. In the presentation at the #IFM2022 Conference, the platform DE\GLOBALIZE itself will be problematized as an example of epistemic violence for the ethnographic memory of (Western) knowledge.
We are not missionaries but perplexed travellers. In our search movement, we engage with psychoanalysis, video, performance, and trance. As disoriented white men, we attempt the reversal of Black Skin, White Masks by Frantz Fanon without blackfacing. We will care not only about the sensitivity of our own skin but also about that of our g/hosts and that of mother earth.
Decarbonisation Strategies in Energy Systems Modelling: APV and e-tractors as Flexibility Assets
(2023)
This work presents an analysis of the impact of introducing Agrophotovoltaic technologies and electric tractors into Germany’s energy system. Agrophotovoltaics involves installing photovoltaic systems in agricultural areas, allowing for dual usage of the land for both energy generation and food production. Electric tractors, which are agricultural machinery powered by electric motors, can also function as energy storage units, providing flexibility to the grid. The analysis includes a sensitivity study to understand how the availability of agricultural land influences Agrophotovoltaic investments, followed by the examination of various scenarios that involve converting diesel tractors to electric tractors. These scenarios are based on the current CO2 emission reduction targets set by the German Government, aiming for a 65% reduction below 1990 levels by 2030 and achieving zero emissions by 2045. The results indicate that approximately 3% of available agricultural land is necessary to establish a viable energy mix in Germany. Furthermore, the expansion of electric tractors tends to reduce the overall system costs and enhances the energy-cost-efficiency of Agrophotovoltaic investments.
The identification of vulnerabilities is an important element in the software development life cycle to ensure the security of software. While vulnerability identification based on source code is a well-studied field, the identification of vulnerabilities on the basis of a binary executable without the corresponding source code is more challenging. Recent research has shown how such detection can be achieved by deep learning methods. However, that particular approach is limited to the identification of only 4 types of vulnerabilities. Subsequently, we analyze to what extent we can cover the identification of a larger variety of vulnerabilities. To this end, a supervised deep learning approach using recurrent neural networks for vulnerability detection based on binary executables is used. The underlying basis is a dataset with 50,651 samples of vulnerable code in the form of a standardized LLVM Intermediate Representation. The vectorised features of a Word2Vec model are used to train different variations of three basic recurrent neural network architectures (GRU, LSTM, SRNN). A binary classification model was established for detecting the presence of an arbitrary vulnerability, and a multi-class model was trained for the identification of the exact vulnerability; they achieved an out-of-sample accuracy of 88% and 77%, respectively. Differences in the detection of different vulnerabilities were also observed, with non-vulnerable samples being detected with a particularly high precision of over 98%. Thus, the methodology presented allows an accurate detection of 23 (compared to 4) vulnerabilities.
Additive manufacturing enables the production of lightweight and resilient components with extensive design freedom. In the low-cost sector, material extrusion (e.g. Fused Deposition Modeling - FDM) has been the main method used to date. Thus, robust 3D printers and inexpensive 3D materials (polymer filaments) can be used. However, the printing times for FDM are very long and the quality of the dimensions and surfaces is limited. Recently, new processes from the field of Vat polymerization have entered the market. For example, masked stereolithography (mSLA) offers a significant improvement in component quality and build speed through the use of resins and large-area curing at still reasonable costs. Currently, there is only limited knowledge available on the optimal design of components using this young process. In this contribution, design guidelines are developed to determine the possibilities and limitations of mSLA from a design point of view. For this purpose, a number of test geometries are designed and investigated to obtain systematic insights into important design features, such as wall thickness, grooves and holes. In addition, typical problems in additive manufacturing, such as the design of overhangs and fits or the hollowing of components, are investigated. The evaluation of practical 3D printing tests thus provides important parameters that can be transferred to design guidelines of components for additive manufacturing using mSLA.
Socially assistive robots (SARs) are becoming more prevalent in everyday life, emphasizing the need to make them socially acceptable and aligned with users' expectations. Robots' appearance impacts users' behaviors and attitudes towards them. Therefore, product designers choose visual qualities to give the robot a character and to imply its functionality and personality. In this work, we sought to investigate the effect of cultural differences on Israeli and German designers' perceptions and preferences regarding the suitable visual qualities of SARs in four different contexts: a service robot for an assisted living/retirement residence facility, a medical assistant robot for a hospital environment, a COVID-19 officer robot, and a personal assistant robot for domestic use. Our results indicate that Israeli and German designers share similar perceptions of visual qualities and most of the robotics roles. However, we found differences in the perception of the COVID-19 officer robot's role and, by that, its most suitable visual design. This work indicates that context and culture play a role in users' perceptions and expectations; therefore, they should be taken into account when designing new SARs for diverse contexts.
This work addresses the conceptualization, design, and implementation of an Application Programming Interface (API) for the Common Security Advisory Framework (CSAF) 2.0, introducing a third method for distributing CSAF documents alongside the two existing ones. Neither of the existing methods supports flexible queries or filtering, which makes it difficult for operators of software and hardware to use CSAF. An API is intended to simplify this process and thus advance CSAF's goal of automation.
First, it is evaluated whether the current standard allows the implementation of an API. Any conflicts are highlighted and suggestions for standard adaptations are made. Based on these results, the API is designed to meet the previously defined requirements. Subsequently, a proof of concept is successfully developed according to the design and extensively tested with specially prepared test data. Finally, the results and the necessary standard adjustments are summarized and justified.
The conceptual design and the implementation were successfully completed. However, during the implementation of the proof of concept, some routes could not be fully implemented.
In the story »Die Schule« (original title: "The Fun They Had") from 1954, the Russian-American scientist and science fiction author Isaac Asimov describes what school looks like in the year 2157 – or, more precisely, that there are no schools anymore. Next to the bedroom in each child's home is a small schoolroom, where the child is taught by a mechanical teacher (a machine with a screen and a slot for handing in homework). This teaching machine is perfectly adjusted to the abilities of the individual child and can teach it optimally. But machines can break down. Eleven-year-old Margie is quizzed on geography by her mechanical teacher again and again, yet graded worse each time. Her mother notices this and calls the school inspector to repair the mechanical teacher.
High-tech running shoes and spikes ("super-footwear") are currently being debated in sports. There is direct evidence that distance running super shoes improve running economy; however, it is not well established to what extent world-class performances are affected over the range of track and road running events.
This study examined publicly available performance datasets of annual best track and road performances for evidence of potential systematic performance effects following the introduction of super footwear. The analysis was based on the 100 best performances per year for men and women in outdoor events from 2010 to 2022, provided by the world governing body of athletics (World Athletics).
We found evidence of progressing improvements in track and road running performances after the introduction of super distance running shoes in 2016 and super spike technology in 2019. This evidence is more pronounced for distances longer than 1500 m in women and longer than 5000 m in men. Women seem to benefit more from super footwear in distance running events than men.
While the observational study design limits causal inference, this study provides a database on potential systematic performance effects following the introduction of super shoes/spikes in track and road running events in world-class athletes. Further research is needed to examine the underlying mechanisms and, in particular, potential sex differences in the performance effects of super footwear.
When people with hearing loss are provided with different devices in each ear, these devices usually have different processing latencies. This leads to static temporal offsets between both ears in the order of several milliseconds. This thesis measured effects of such offsets in stimulation timing on mechanisms of binaural hearing, such as sound localization and speech understanding in noise in hearing-impaired and normal-hearing listeners.
Visualizing program execution is a key aid for programming novices, making it easier to understand how code runs and supporting their start in software development. This Master's thesis presents a generic framework tailored to the needs of beginners, with the focus on a simple, understandable, yet correct representation of program execution. The framework integrates the Debug Adapter Protocol in order to address and use the debuggers of different languages.
The thesis first discusses the requirements for the generic framework. Existing approaches to visualizing program execution are then examined and analyzed in depth. The implementation of the framework is subsequently described in detail, with particular emphasis on extensibility to further languages.
To evaluate the suitability of the framework, several exercises from the first module of the Applied Computer Science degree program at Offenburg University, in the programming language used there, are considered. The results show that the framework can handle these exercises and present them correctly and understandably.
Electrochemical pressure impedance spectroscopy (EPIS) is an emerging tool for the diagnosis of polymer electrolyte membrane fuel cells (PEMFC). It is based on analyzing the frequency response of the cell voltage with respect to an excitation of the gas-phase pressure. Several experimental studies in the past decade have shown the complexity of EPIS signals, and so far there is no agreement on the interpretation of EPIS features. The present study contributes to shedding light on the physicochemical origin of EPIS features by using a combination of pseudo-two-dimensional modeling and analytical interpretation. Using static simulations, the contributions of cathode equilibrium potential, cathode overpotential, and membrane resistance to the quasi-static EPIS response are quantified. Using model reduction, the EPIS responses of individual dynamic processes are predicted and compared to the response of the full model. We show that the EPIS signal of the PEMFC studied here is dominated by the humidifier. The signal is further analyzed by using transfer functions between various internal cell states and the outlet pressure excitation. We show that the EPIS response of the humidifier is caused by an oscillating oxygen molar fraction due to an oscillating mass flow rate.
Electrochemical pressure impedance spectroscopy (EPIS) has received the attention of researchers as a method to study mass transport processes in polymer electrolyte membrane fuel cells (PEMFC). It is based on analyzing the cell voltage response to a harmonic excitation of the gas phase pressure in the frequency domain. Several experiments with a single-cell fuel cell have shown that the spectra contain information in the frequency range typical for mass transport processes and are sensitive to specific operating conditions and structural fuel cell parameters. To further benefit from the observed features, it is essential to identify why they occur, which to date has not yet been accomplished. The aim of the present work, therefore, is to identify causal links between internal processes and the corresponding EPIS features.
To this end, the study follows a model-based approach, which allows the analysis of internal states that are not experimentally accessible. The PEMFC model is a pseudo-2D model, which connects the mass transport along the gas channel with the mass transport through the membrane electrode assembly. A modeling novelty is the consideration of the gas volume inside the humidifier upstream of the fuel cell inlet, which proves to be crucial for the reproduction of EPIS. The PEMFC model is parametrized to a 100 cm² single cell of the French project partner, who provided the experimental EPIS results reproduced and interpreted in the present study.
The simulated EPIS results show good agreement with the experiments at current densities ≤ 0.4 A cm⁻², where they allow a further analysis of the observed features. At the lowest excitation frequency of 1 mHz, the dynamic cell voltage response approaches the static pressure-voltage response. In the simulated frequency range between 1 mHz and 100 Hz, the cell voltage oscillation is found to strongly correlate with the partial pressure oscillation of oxygen, whereas the influence of the water pressure is limited to the low-frequency region.
The two prominent EPIS features, namely the strong increase of the cell voltage oscillation and the increase of phase shift with frequency, can be traced back via the oxygen pressure to the oscillation of the inlet flow rate. The phenomenon of the oscillating inlet flow rate is a consequence of the pressure change of the gas phase inside the humidifier and increases with frequency. This important finding enables the interpretation of experimentally observed EPIS trends for a variation of operational and structural fuel cell parameters by tracing them back to the influence of the oscillating inlet flow rate.
The separate simulation of the time-dependent processes of the PEMFC model through model reduction shows their individual influence on EPIS. The sluggish process of the water uptake by the membrane is visible below 0.1 Hz, while the charge and discharge of the double layer becomes visible above 1 Hz. The gas transport through the gas diffusion layer is only visible above 100 Hz. The gas transport through the gas channel without consideration of the humidifier becomes visible above 1 Hz; with consideration of the humidifier, it is visible throughout the frequency range. The strong similarity of the spectra considering the humidifier to the spectra of the full model setup shows the dominant influence of the humidifier on EPIS.
A promising observation is the change in the amplitude relationship between the cell voltage and the oxygen partial pressure oscillation as a function of the oxygen concentration in the catalyst layer. At a frequency where the influence of oxygen pressure on the cell voltage is dominant, for example at 1 Hz, the amplitude of the cell voltage oscillation could be used to indirectly measure the oxygen concentration in the catalyst layer.
Enhancing engineering creativity with automated formulation of elementary solution principles
(2023)
The paper describes a method for the automated formulation of elementary creative stimuli for product or process design at different levels of abstraction and in different engineering domains. The experimental study evaluates the impact of structured automated idea generation on inventive thinking in engineering design and compares it with previous experimental studies in educational and industrial settings. The outlook highlights the benefits of using automated ideation in the context of AI-assisted invention and innovation.
This thesis evaluates and compares current Full-Stack JavaScript technologies. Through extensive research on the state of the art of JavaScript and its related frameworks, different aspects of Full-Stack Development are analysed to judge the popularity of technologies.
The language JavaScript and the idea of Full-Stack Development are presented together with the functionality of different frameworks. The JavaScript runtime Node.js was examined and identified as the most influential JavaScript technology, one that has opened up many opportunities.
The technology stacks MERN, MEAN, and MEVN were investigated, all featuring the base technologies Node.js, MongoDB, and Express.js. It was discovered that front-end frameworks have the most influence on which Full-Stack variant can be chosen. Comparison criteria between the technology stacks were the learning curve, maintainability, modularity, and media integration. These criteria were extracted from research and a questionnaire conducted with students of the University of Applied Sciences Offenburg.
For the purposes of testing and experiencing a Full-Stack JavaScript application, the game RemArrow, based on the 1970s game Simon, was designed and implemented. The comparison with the predefined criteria shows that the MERN stack with React.js is the easiest to learn and promises the most potential. Which JavaScript technologies emerge and gain popularity depends strongly on the industry and the skill set of the developer.
In conclusion, it can be established that the concept of Full-Stack Development is currently very interesting and more than just a trend. It has the potential to become a new kind of web development and part of the curriculum taught at universities. Expert knowledge is needed, but there is high demand and much potential for Full-Stack JavaScript developers.
The increasing diffusion of rapidly developing AI technologies led to the idea of the experiment to combine TRIZ-based automated idea generation with the natural language processing tool ChatGPT, using the chatbot to interpret the automatically generated elementary solution principles. The article explores the opportunities and benefits of a novel AI-enhanced approach to teaching systematic innovation, analyses the learning experience, identifies the factors that affect students' innovation and problem-solving performance, and highlights the main difficulties students face, especially in interdisciplinary problems.
In this paper, we describe the first publicly available fine-grained product recognition dataset based on leaflet images. Using advertisement leaflets collected over several years from different European retailers, we provide a total of 41.6k manually annotated product images in 832 classes. Further, we investigate three different approaches for this fine-grained product classification task: classification by image, by text, and by image and text combined. The "classification by text" approach uses the text extracted directly from the leaflet product images. We show that the combination of image and text as input improves the classification of visually difficult-to-distinguish products. The final model reaches an accuracy of 96.4% with a Top-3 score of 99.2%. We release our code at https://github.com/ladwigd/Leaflet-Product-Classification.
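The benefit of combining image and text inputs can be illustrated with a simple late-fusion sketch. Everything below — the class names, scores, and equal weighting — is hypothetical and not taken from the paper or its released code; it only shows how text evidence (e.g. from OCR on a leaflet crop) can disambiguate visually similar products.

```python
# Hypothetical late fusion of per-class scores from an image model and a
# text model. Class names, scores, and the weight alpha are illustrative.

def fuse_scores(image_scores, text_scores, alpha=0.5):
    """Weighted late fusion of two per-class score dictionaries."""
    classes = set(image_scores) | set(text_scores)
    return {
        c: alpha * image_scores.get(c, 0.0) + (1 - alpha) * text_scores.get(c, 0.0)
        for c in classes
    }

def top_k(scores, k=3):
    """Return the k highest-scoring classes, best first."""
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Two visually similar products that differ mainly in their printed text:
image_scores = {"cola_0.5l": 0.48, "cola_zero_0.5l": 0.46, "lemonade_1l": 0.06}
text_scores = {"cola_zero_0.5l": 0.85, "cola_0.5l": 0.10, "lemonade_1l": 0.05}

fused = fuse_scores(image_scores, text_scores)
print(top_k(fused, k=1))  # → ['cola_zero_0.5l']: the text resolves the ambiguity
```

In this toy example the image model alone would narrowly pick the wrong product; adding the text scores flips the decision, mirroring the paper's finding that text helps for visually hard-to-distinguish classes.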
Atrial fibrillation is the most common tachycardic cardiac arrhythmia worldwide. The heart loses its normofrequent sinus rhythm and no longer beats regularly, but too fast and irregularly. Atrial fibrillation is usually not a life-threatening arrhythmia, but it can lead to a stroke. The arrhythmia is caused by re-entrant or focal excitations in the left atrium, which mainly originate from one or more pulmonary veins. The standard therapy for atrial fibrillation is pulmonary vein isolation.
This Bachelor's thesis therefore deals with the modeling of different left atrial focus models and intracardiac electrode catheters for the diagnosis and termination of atrial fibrillation by means of pulmonary vein isolation in the Offenburg heart rhythm model according to Schalk, Krämer, and Benke, which was implemented in CST Studio Suite.
First, the various left atrial focal fibrillation sources were modeled and then simulated. Simulations were carried out with focal sources originating from single, dual, or all four pulmonary veins. A further simulation based on real-world biosignals was also created. These simulations made the electrical excitation sequence visible. Next, the catheters for diagnostics and for pulmonary vein isolation were modeled and integrated into the existing Offenburg heart rhythm model. The diagnostic catheters were 10-pole Lasso® catheters, two variants of the PentaRay® NAV eco catheter, and 4-pole "OSYPKA FINDER pure®" diagnostic catheters. The ablation catheters were two variants of the Pentaspline Basket pose catheter and the HELIOSTAR™ ablation balloon. Finally, different variants of pulmonary vein isolation procedures were modeled, and the left atrial focal fibrillation sources were simulated after isolation of the pulmonary veins.
Gamification is used in many areas, including the education sector, to increase motivation and performance. This contribution describes the design, implementation, and evaluation of a gamification concept for the "Software Engineering" lecture at Offenburg University. The lecturers' intention is for gamification to encourage continuous and deeper engagement with the topics of the lecture and to have a positive influence on students' motivation in order to support the learning process. Central to the gamification design are voluntary participation, the perceived relevance of the learning content, and a goal-oriented use of gamification elements. The concept was implemented in the Moodle learning platform, used over three semesters, and evaluated in parallel. The results of these evaluations show that students used the gamified course intensively, often throughout the entire semester, and completed a large number of exercises of their own accord.
Garbage in, Garbage out: How does ambiguity in data affect state-of-the-art pedestrian detection?
(2024)
This thesis investigates the critical role of data quality in computer vision, particularly in the realm of pedestrian detection. The proliferation of deep learning methods has emphasised the importance of large datasets for model training, while the quality of these datasets is equally crucial. Ambiguity in annotations, arising from factors like mislabelling, inaccurate bounding box geometry, and annotator disagreements, poses significant challenges to the reliability and robustness of pedestrian detection models and their evaluation. This work explores the effects of ambiguous data on model performance, with a focus on identifying and separating ambiguous instances, employing an ambiguity measure based on annotator estimations of object visibility and identity. Through careful experimentation and analysis, trade-offs emerged between data cleanliness and representativeness, and between noise removal and the retention of valuable data, elucidating their impact on performance metrics like the log-average miss rate, recall, and precision. Furthermore, a strong correlation between ambiguity and occlusion was discovered, with higher ambiguity corresponding to greater occlusion prevalence. The EuroCity Persons dataset served as the primary dataset, revealing a significant proportion of ambiguous instances: approximately 8.6% in the training set and 7.3% in the validation set. Results demonstrated that removing ambiguous data improves the log-average miss rate, particularly by reducing false positive detections. Augmenting the training data with samples from neighbouring classes enhanced recall but diminished precision. Correcting erroneous false positives and false negatives significantly impacts model evaluation results, as evidenced by shifts in the ECP leaderboard rankings.
By systematically addressing ambiguity, this thesis lays the foundation for enhancing the reliability of computer vision systems in real-world applications, motivating the prioritisation of developing robust strategies to identify, quantify and address ambiguity.
The coronavirus semesters required transferring the mathematics bridge courses into a digital teaching format. Personal support and social integration play a particularly important role for students, especially at the start of their studies. The particular challenge in moving to a digital format was therefore to compensate for the usual opportunities to get to know each other and communicate that arise in face-to-face formats, for example during breaks or in conversation with seatmates. This contribution presents the extent to which the transfer to a digital format succeeded. The digital bridge course concept was transferred into a didactic design pattern in order to facilitate transfer and comparability of the results through a structured and comprehensible presentation.
In the 19th century, Alexander von Humboldt explored nature and conceived a new vision of it that still influences the way we understand the world. Humboldt believed in the importance of accurate measurements and precise description of observations. His vision of nature included not only facts but also emotions.
Today, smart solutions are being developed using computer technology, which will influence our relationship to nature, our handling of the complexity and diversity of nature itself, and the technological influences on society. Can we avoid a new form of "colonialism" when a network of supercomputers creates a smarter world?
The integration of additive manufacturing processes into the teaching of students is an important prerequisite for the further dissemination of this new technology. In this context, Design for Additive Manufacturing (DfAM) is of particular importance. For this reason, this paper presents an approach in which a connection is made between methodical product development and practical implementation by AM. Using a model racing car as an example, students independently develop significant improvements to particular assemblies. A final evaluation shows that the students have significantly improved their skills and competencies.
State-of-the-art models for pixel-wise prediction tasks such as image restoration, image segmentation, or disparity estimation involve several stages of data resampling, in which the resolution of feature maps is first reduced to aggregate information and then sequentially increased to generate a high-resolution output. Several previous works have investigated the effect of artifacts that are introduced during downsampling, and diverse cures have been proposed that help improve prediction stability and even robustness for image classification. However, the equally relevant artifacts that arise during upsampling have been discussed far less. This matters because upsampling and downsampling face fundamentally different challenges: while during downsampling, aliases and artifacts can be reduced by blurring feature maps, the emergence of fine details is crucial during upsampling. Blurring is therefore not an option, and dedicated operations need to be considered. In this work, we are the first to explore the relevance of context during upsampling by employing convolutional upsampling operations with increasing kernel size while keeping the encoder unchanged. We find that increased kernel sizes can in general improve prediction stability in tasks such as image restoration or image segmentation, while a block that allows for a combination of small kernels for fine details and large kernels for artifact removal and increased context yields the best results.
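The interaction between upsampling kernel size and artifacts can be made concrete with a minimal 1D transposed-convolution sketch. Everything below is illustrative — the kernels are hand-picked, and the paper's actual 2D operators, learned weights, and settings are not reproduced here. The point it demonstrates is a standard one: when the kernel size is not an integer multiple of the stride, output positions receive unequal numbers of contributions, producing the checkerboard pattern even on a flat input.

```python
# Minimal 1D transposed convolution (stride 2), pure Python, to illustrate
# how upsampling kernel size interacts with artifacts.

def upsample_transposed(signal, kernel, stride=2):
    """1D transposed convolution: each input sample scatters a scaled
    copy of the kernel into the output at stride-spaced positions."""
    out = [0.0] * ((len(signal) - 1) * stride + len(kernel))
    for i, x in enumerate(signal):
        for j, k in enumerate(kernel):
            out[i * stride + j] += x * k
    return out

flat = [1.0, 1.0, 1.0, 1.0]

# Kernel size 2 with stride 2: every output position is covered exactly
# once, so a flat input stays flat (no checkerboard).
print(upsample_transposed(flat, [1.0, 1.0]))
# → [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]

# Kernel size 3 with stride 2: interior even positions receive two
# contributions, odd positions only one -> uneven coverage, the classic
# checkerboard pattern on a flat input.
print(upsample_transposed(flat, [1.0, 1.0, 1.0]))
# → [1.0, 1.0, 2.0, 1.0, 2.0, 1.0, 2.0, 1.0, 1.0]
```

Learned kernels can in principle compensate for this uneven coverage, which is one reason larger kernels — with more context per output position — can behave more stably.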
Inner Congo
(2023)
This research-creation project, part of the DE\GLOBALIZE artistic research cycle presented at the #IFM2022 Conference, investigates the complexities of Congo violence, care, and colonialism. Drawing on Michel Serres' metaphor of the great estuaries, the study explores the topology of interactive documentaries, blending theory, emotion, and personal experiences. Accessible through the interactive web documentation at http://deglobalize.com, the platform offers a media-archaeological archive for speculative ethnography, enabling the forensic processing of single documents in line with actor-network theory.
A balcony photovoltaic (PV) system, also known as a micro-PV system, is a small PV system consisting of one or two solar modules with an output of 100–600 Wp and a corresponding inverter that uses standard plugs to feed the renewable energy into the house grid. In the present study we demonstrate the integration of a commercial lithium-ion battery into a commercial micro-PV system. We first show simulations over one year at one-second time resolution, which we use to assess the influence of battery and PV size on self-consumption, self-sufficiency, and the annual cost savings. We then develop and operate experimental setups using two different architectures for integrating the battery into the micro-PV system. In the passive hybrid architecture, the battery is in parallel electrical connection to the PV module. In the active hybrid architecture, an additional DC-DC converter is used. Both architectures include measures to avoid maximum power point tracking of the battery by the module inverter. The resulting PV/battery/inverter systems with 300 Wp PV and a 555 Wh battery were tested in continuous operation over three days under real solar irradiance conditions. Both architectures were able to maintain stable operation and demonstrate the shift of PV energy from the day into the night. System efficiencies were observed comparable to a reference system without battery. This study therefore demonstrates the feasibility of both active and passive coupling architectures.
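The two metrics assessed in the simulations can be computed directly from paired generation and load time series. The sketch below uses an invented toy hourly profile purely for illustration; the study's actual simulations ran over a full year at one-second resolution.

```python
# Hedged sketch: self-consumption and self-sufficiency from paired PV
# generation and household load series. The profiles below are invented.

def pv_metrics(pv, load):
    """Return (self_consumption, self_sufficiency) for matching series.

    Self-consumption: share of PV energy used directly by the load.
    Self-sufficiency: share of the load covered directly by PV.
    (A battery raises both by shifting surplus PV to later deficits.)
    """
    direct = sum(min(p, l) for p, l in zip(pv, load))
    return direct / sum(pv), direct / sum(load)

# Toy hourly profile (W): PV peaks at midday, load peaks in the evening.
pv = [0, 0, 100, 300, 300, 100, 0, 0]
load = [50, 50, 100, 100, 100, 100, 200, 200]

sc, ss = pv_metrics(pv, load)
print(f"self-consumption {sc:.0%}, self-sufficiency {ss:.0%}")
# → self-consumption 50%, self-sufficiency 44%
```

The mismatch between midday generation and evening load is exactly what battery integration addresses: stored midday surplus served in the evening raises both metrics.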
Artificial intelligence (AI), and in particular machine learning algorithms, are of increasing importance in many application areas, but interpretability and understandability as well as responsibility, accountability, and fairness of the algorithms' results, all crucial for increasing humans' trust in the systems, are still largely missing. Big industrial players, including Google, Microsoft, and Apple, have become aware of this gap and recently published their own guidelines for the use of AI in order to promote fairness, trust, interpretability, and other goals. Interactive visualization is one of the technologies that may help to increase trust in AI systems. During the seminar, we discussed the requirements for trustworthy AI systems as well as the technological possibilities provided by interactive visualizations to increase human trust in AI.
The use of artificial intelligence continues to impact a broad variety of domains, application areas, and people. However, interpretability, understandability, responsibility, accountability, and fairness of the algorithms' results - all crucial for increasing humans' trust in the systems - are still largely missing. The purpose of this seminar is to understand how these components factor into a holistic view of trust. Further, this seminar seeks to identify design guidelines and best practices for how to build interactive visualization systems to calibrate trust.
During the coronavirus crisis, labs in mechanical engineering had to be offered in digital form at short notice. For this purpose, digital twins of more complex test benches in the field of fluid energy machines were used in the mechanical engineering course, with which the students were able to interact remotely to obtain measurement data. The concept of each lab was revised with regard to its implementation as a remote laboratory. Fortunately, the real-world labs could be fully replaced by remote labs. Student perceptions of the remote labs were mostly positive. This paper explains the concept and design of the digital twins and the lab, as well as the layout, procedure, and finally the results of the accompanying evaluation. However, the implementation of the digital twins to date does not yet include features that address the tactile experience of working in real-world labs.
Artificial intelligence (AI) permeates our lives ever more deeply. Students are increasingly confronted with AI applications in everyday life and at universities. Offenburg University is therefore anchoring AI-related courses in its curricula to support students in acquiring AI competence.
This contribution presents a concept for developing courses based on the idea of pedagogical making to foster AI competence in higher education. The concept is made concrete through a module on chatbots, whose teaching content is developed interdisciplinarily from different perspectives.