In industrial production, Explainable Artificial Intelligence offers the opportunity to better understand AI models: not only to interpret their decisions, but also to improve processes, even without deep prior experience. There are several widely used methods for making Artificial Intelligence models explainable, but no uniform standards and definitions. There are also still hurdles at the technical and legal level, as well as a lack of trust on the human side. Ultimately, the application of such techniques is particularly important in this area, but it is often not possible without well-functioning and widespread use of Artificial Intelligence in the manufacturing industry. Nevertheless, it is a further step towards intelligent manufacturing and the acceptance of Artificial Intelligence in this area. The publication presents not only the latest developments but also several use cases of Explainable Artificial Intelligence in manufacturing, and highlights both legal and ethical aspects. In conclusion, it can be stated that Explainable Artificial Intelligence methods per se are well established, but their implementation and use in practice is lagging behind. Major topics for the future will be the explanation of generative Artificial Intelligence and user-centred explanations for non-domain experts. A standardised definition of explainability and standards for evaluating the individual methodologies must also be created.
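As an illustration of the kind of widely used, model-agnostic explainability methods the abstract refers to, here is a minimal sketch that explains a synthetic quality-prediction classifier with permutation feature importance; the model, feature names, and data are illustrative assumptions and not taken from the publication.

```python
# Minimal sketch (illustrative only): explaining a synthetic quality classifier
# with model-agnostic permutation feature importance. Features and labels are
# invented placeholders, not data from the publication.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                    # e.g. temperature, pressure, speed, vibration
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)    # synthetic "defect" label

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Shuffle each feature in turn and measure the drop in accuracy: a large drop
# means the model relies on that process parameter for its decision.
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
for name, imp in zip(["temperature", "pressure", "speed", "vibration"],
                     result.importances_mean):
    print(f"{name}: {imp:.3f}")
```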
Erlang is a dynamically typed language with support for optional type annotations. Though Erlang’s type annotations were originally intended for documentation, static analysis tools soon utilized them for semantic checks. The most advanced and mature of these tools is Dialyzer, a success-typing-based tool widely used in current projects. Attempts to retrofit a static type system employing the type annotations have so far remained in the realm of research prototypes. Recently, three further tools have been developed: Gradualizer, eqWAlizer, and Etylizer. However, because there is no agreed-upon semantics for Erlang’s type annotations, their results differ in ways that can be challenging for users to interpret. In this paper, we cross-compare the state-of-the-art static checkers regarding their expressivity and performance on the union of their respective test suites. Unsurprisingly, we find that the tools perform best on their own test suites. While Gradualizer, Etylizer, and eqWAlizer disagree on 25%-45% of test cases across all test suites, Dialyzer’s success-typing approach sets it apart in its interpretation of the type annotations. Our analysis emphasizes that the nature of Erlang’s type language remains challenging when it comes to developing a correct and efficient static type checker.
Transportation planners are increasingly relying on AI to optimize logistics and solve persistent challenges. However, as AI advances rapidly, most software vendors are unable to evaluate and implement all new developments. This paper uses bibliometric methods to track and evaluate the emerging trend of neurosymbolic AI, which combines neural networks with symbolic AI to improve decision making. By analyzing literature and citation data, we gain insights into the development and impact of neurosymbolic AI. The results provide a scalable approach for practitioners to efficiently identify and evaluate AI trends to facilitate the strategic adoption of technologies and innovations in transportation planning.
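A minimal sketch of the kind of bibliometric trend tracking described above, assuming a hypothetical CSV export of bibliographic records with year, title, and abstract columns (real Scopus or Web of Science exports use different schemas):

```python
# Minimal sketch (assumed file and column names): count publications per year
# that mention "neurosymbolic" to approximate the trend curve of the topic.
import pandas as pd

records = pd.read_csv("neurosymbolic_ai_records.csv")   # hypothetical export

mask = (records["title"].str.contains("neurosymbolic", case=False, na=False)
        | records["abstract"].str.contains("neurosymbolic", case=False, na=False))

per_year = records[mask].groupby("year").size()          # publications per year
growth = per_year.pct_change()                            # year-over-year growth
print(per_year.to_string())
print(growth.round(2).to_string())
```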
This study focuses on enhancing odometry estimation for self-driving cars using LiDAR-based sensor technology. The project involves integrating LiDAR sensors, which generate detailed 3D point clouds of the environment, into the car’s sensor suite. This integration can be useful when GPS-based odometry estimation is not accurate enough. The point clouds are then used to accurately estimate the vehicle’s movement and position using learning-based and model-based odometry estimation methods.
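As one possible model-based variant of the point-cloud odometry mentioned above, the sketch below aligns two consecutive (hypothetical) LiDAR scans with ICP using Open3D; the file names, voxel size, and correspondence threshold are assumptions, not the study’s actual pipeline.

```python
# Minimal sketch (illustrative only): incremental LiDAR odometry via ICP scan
# matching between two consecutive scans. File names and parameters are assumed.
import numpy as np
import open3d as o3d

prev_scan = o3d.io.read_point_cloud("scan_000.pcd")   # hypothetical consecutive scans
curr_scan = o3d.io.read_point_cloud("scan_001.pcd")

# Downsample to keep registration fast and robust to noise.
prev_ds = prev_scan.voxel_down_sample(voxel_size=0.2)
curr_ds = curr_scan.voxel_down_sample(voxel_size=0.2)

# Estimate the rigid transform that aligns the current scan to the previous one:
# arguments are source, target, max correspondence distance, initial guess, method.
result = o3d.pipelines.registration.registration_icp(
    curr_ds, prev_ds, 1.0, np.eye(4),
    o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)

# The 4x4 transform is the incremental vehicle motion between the two scans;
# chaining these transforms over time yields the odometry trajectory.
print(result.transformation)
```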
Performance benchmarking is crucial for optimizing networks, including 5G Non-Public Networks (5G-NPN). Since one of the major advantages of 5G-NPN is the ability to guarantee Quality of Service (QoS), ensuring optimal performance for their diverse applications is critical. This requires adjusting and testing various radio-related parameters. Performance analysis and benchmarking also need to be based on the evaluation of relevant Key Performance Indicators (KPIs) in order to identify the parameter set that yields optimum performance for each application’s requirements. Many published results on performance benchmarking lack transparency in their scoring methods. Additionally, QoS benchmarking for 5G-NPN use cases needs further steps due to the varying ranges of the KPIs: for example, Block Error Rate (BLER) is mostly expressed as a percentage, while Reference Signal Received Power (RSRP) and Reference Signal Received Quality (RSRQ) take negative values. ETSI TR 103 559 outlines practices for benchmarking network QoS, with a focus on Speech and multimedia Transmission Quality (STQ). This paper extends these guidelines to evaluate 5G-NPN performance by defining a multi-objective function. We select a specific 5G-NPN use case as an example and apply four tests with varying network configurations. After each test, we collect the most relevant 5G KPIs. To facilitate comparison, all KPIs are rescaled to a common scale. Additionally, we assign a weight to each KPI based on its significance in the chosen use case. By combining rescaling and weight assignment, we propose a single metric that effectively characterizes the overall network performance of 5G-NPNs based on their specific use-case requirements.
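A minimal sketch of the rescaling-and-weighting idea behind such a single metric: each KPI is mapped to [0, 1] over an assumed expected range (inverting "lower is better" metrics) and combined as a weighted sum. The KPI values, ranges, and weights below are illustrative assumptions, not the paper’s configuration.

```python
# Minimal sketch (illustrative values): min-max rescale each KPI to [0, 1],
# flip "lower is better" metrics, then combine into one weighted score.

def rescale(value, lo, hi, higher_is_better=True):
    """Min-max rescale a KPI to [0, 1] over its expected range."""
    x = (value - lo) / (hi - lo)
    x = min(max(x, 0.0), 1.0)
    return x if higher_is_better else 1.0 - x

kpis = {
    # name: (measured value, expected min, expected max, higher is better?, weight)
    "throughput_mbps": (85.0,    0.0, 100.0, True,  0.4),
    "rsrp_dbm":        (-92.0, -120.0, -70.0, True,  0.2),
    "bler_percent":    (3.0,     0.0,  10.0, False, 0.3),
    "latency_ms":      (12.0,    5.0,  50.0, False, 0.1),
}

score = sum(w * rescale(v, lo, hi, better) for v, lo, hi, better, w in kpis.values())
print(f"composite benchmark score: {score:.3f}")   # single metric per test configuration
```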
Variable refrigerant flow (VRF) systems are constantly prone to failures during their lifespan, causing breakdowns, high energy bills, and indoor discomfort. In addition to correctly identifying these defects, fault detection and diagnosis studies should be able to anticipate and predict anomalies before they occur, enabling efficient maintenance. Therefore, this study introduces an efficient self-learning predictive maintenance system, CACMMS (Cloud Air Conditioning Monitoring & Management System), designed to anticipate refrigerant leaks in VRF systems. Unlike previous efforts, this system leverages advanced fault detection and diagnosis strategies in a real, existing building to enhance prediction accuracy. The study employed three noise-filtering models (Kalman filter, moving average, Savitzky-Golay smoothing) in the preprocessing phase. Ten features were selected for assessment, and four machine learning models (decision tree, random forest, K-nearest neighbor, support vector machine) were compared. Accuracy, precision, sensitivity, computation time, and the confusion matrix were used as performance indicators to evaluate and select the best-performing model. Results indicated that the decision tree and random forest models achieved over 95 % accuracy with execution times between 0.70 s and 3.32 s, outperforming the K-nearest neighbor and support vector machine models. These findings highlight the system’s potential to reduce downtime and energy costs through effective predictive maintenance.
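A minimal sketch of the pipeline shape described above, assuming synthetic sensor data: Savitzky-Golay smoothing in preprocessing, followed by a comparison of two of the evaluated classifiers on a leak / no-leak label. None of the data or parameters come from the CACMMS system itself.

```python
# Minimal sketch (synthetic placeholders): smooth noisy sensor signals, then
# compare classifiers on a leak / no-leak label, as in the described study.
import numpy as np
from scipy.signal import savgol_filter                 # Savitzky-Golay smoothing
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
raw = rng.normal(size=(2000, 10))                      # 10 sensor features per sample
smoothed = savgol_filter(raw, window_length=11, polyorder=2, axis=0)
labels = (smoothed[:, 0] - 0.8 * smoothed[:, 3] > 0.5).astype(int)  # synthetic "leak" flag

X_tr, X_te, y_tr, y_te = train_test_split(smoothed, labels, test_size=0.3, random_state=1)

for name, model in [("decision tree", DecisionTreeClassifier()),
                    ("random forest", RandomForestClassifier(n_estimators=100))]:
    model.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, model.predict(X_te)))
```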
Traditional authentication involves sharing a considerable amount of personal and identifying information. Usually, a single central authority controls the data of all its users. This creates a single point of failure, and users typically have to relinquish control over their data. It is therefore important to explore alternative authentication mechanisms that uphold data sovereignty. Data sovereignty describes forms of independence, control, and autonomy over digital data; enforcing it also requires independence from central authorities. This paper explores alternative decentralized authentication methods. It leverages Verifiable Credentials (VCs), which allow verification without needing to contact the issuer, and self-sovereign identities in the form of Decentralized Identifiers (DIDs). Building on the decentralized authentication supported by VCs and DIDs, the paper presents two use cases that illustrate how they could be applied.
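A minimal conceptual sketch of issuer-free verification: the verifier checks the credential’s signature against a public key obtained from a (mocked) DID document, so the issuer does not need to be contacted at verification time. Real VC/DID stacks use JSON-LD or JWT proof formats and proper DID resolution; the identifiers and structures below are illustrative assumptions.

```python
# Minimal conceptual sketch (mocked DID resolution, invented identifiers):
# verify a signed credential offline, without contacting the issuer.
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Issuer side (done once, out of band): sign the credential payload.
issuer_key = Ed25519PrivateKey.generate()
credential = {"id": "urn:example:cred:1", "issuer": "did:example:issuer",
              "subject": "did:example:alice", "claim": {"role": "student"}}
payload = json.dumps(credential, sort_keys=True).encode()
signature = issuer_key.sign(payload)

# Verifier side: resolve the issuer DID to its public key (mocked here as a dict).
did_document = {"did:example:issuer": issuer_key.public_key()}
public_key = did_document[credential["issuer"]]
try:
    public_key.verify(signature, payload)     # raises InvalidSignature if tampered
    print("credential accepted")
except InvalidSignature:
    print("credential rejected")
```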
Over the years, the Internet of Things (IoT) has brought significant benefits to modern society, lives, and industries; however, the technology used has yet to mature sufficiently to provide secure devices and communication. Recently, the number of connected devices has grown rapidly, giving adversaries more opportunities to gain access to IoT devices and use them to launch large-scale attacks. With the rapid proliferation of IoT devices, the need for efficient and effective Intrusion Detection Systems (IDS) tailored to IoT environments has become increasingly pressing. This paper explores various techniques employed in contemporary IoT IDS, including traditional signature-based approaches such as Snort and Bro/Zeek, as well as emerging deep learning-based methods.
Positioning and accurate time-delay measurement techniques have been used with the Internet of Things (IoT) and embedded systems due to their importance in providing location information for communicating nodes. In the last two decades, positioning techniques have been introduced that use the Received Signal Strength Indicator (RSSI) of Wi-Fi signals as well as time-based techniques. Fine Timing Measurement (FTM) is the most important time-based technique; it relies on timestamps captured during the message exchange between nodes. Thus, in addition to its originally intended application in wireless localization, it can be used for the future of Wi-Fi Time-Sensitive Networking (WTSN), where low latency, low jitter, and precise time synchronization play an important role in Industrial IoT (IIoT)-oriented applications. The presented work considers FTM’s behavior and performance, especially in a factory environment with different room sizes. An Automated Physical Test Bed (APTB) and an emulated multipath propagation model based on the ITU-R radio wave propagation standard for factory environments are used in this work. The results show that FTM performance is noticeably affected by multipath signal propagation, which increases round-trip time (RTT), delay, and jitter fluctuation and results in noticeable RSSI degradation. In contrast, the total number of correctly received frames is not affected, indicating the efficiency and reliability of the Wi-Fi FTM technique.
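For reference, a minimal sketch of the round-trip-time calculation that FTM is built on: the responder records t1 and t4, the initiator records t2 and t3, and the initiator’s turnaround time is subtracted before converting time of flight to distance. The timestamp values below are illustrative.

```python
# Minimal sketch of the FTM round-trip-time calculation (IEEE 802.11 FTM):
# t1 = FTM frame departure (responder), t2 = arrival (initiator),
# t3 = ACK departure (initiator), t4 = ACK arrival (responder).
C = 299_792_458.0                      # speed of light in m/s

def ftm_distance(t1_ns, t2_ns, t3_ns, t4_ns):
    rtt_ns = (t4_ns - t1_ns) - (t3_ns - t2_ns)   # time of flight, both directions
    return C * (rtt_ns * 1e-9) / 2.0             # one-way distance in metres

# Example (illustrative timestamps): 80 ns total flight time -> roughly 12 m.
print(ftm_distance(t1_ns=0.0, t2_ns=40.0, t3_ns=540.0, t4_ns=580.0))
```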