This thesis focuses on the development and implementation of a Datagram Transport Layer Security (DTLS) communication framework within the ns-3 network simulator, specifically targeting the LoRaWAN model network. The primary aim is to analyse the behaviour and performance of DTLS protocols across different network conditions within a LoRaWAN context. The key aspects of this work include the following.
Utilization of ns-3: This thesis leverages ns-3’s capabilities as a powerful discrete event network simulator. This platform enables the emulation of diverse network environments, characterized by varying levels of latency, packet loss, and bandwidth constraints.
Emulation of Network Challenges: The framework specifically addresses unique challenges posed by certain network configurations, such as duty cycle limitations. These constraints, which limit the time allocated for data transmission by each device, are crucial in understanding the real-world performance of DTLS protocols.
Testing in Multi-client-server Scenarios: A significant feature of this framework is its ability to test DTLS performance in complex scenarios involving multiple clients and servers. This is vital for assessing the behaviour of a protocol under realistic network conditions.
Realistic Environment Simulation: By simulating challenging network conditions, such as congestion, limited bandwidth, and resource constraints, the framework provides a realistic environment for thorough evaluation. This allows for a comprehensive analysis of DTLS in terms of security, performance, and scalability.
Overall, this thesis contributes to a deeper understanding of DTLS protocols by providing a robust tool for their evaluation under various and challenging network conditions.
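The duty-cycle limitation among the key aspects above can be made concrete with a short calculation. The 1 % duty cycle and 400 ms airtime below are illustrative values (typical of the EU868 band), not figures taken from the thesis:

```python
def min_off_time(time_on_air_s: float, duty_cycle: float) -> float:
    """Minimum enforced silence after a transmission of `time_on_air_s`
    seconds under a duty-cycle limit (e.g. 0.01 for a 1 % band)."""
    return time_on_air_s * (1.0 / duty_cycle - 1.0)

# A 400 ms LoRaWAN uplink under a 1 % duty cycle forces ~39.6 s of silence,
# so a multi-flight DTLS handshake is stretched out accordingly:
flights = 4
handshake_floor_s = flights * min_off_time(0.4, 0.01)
```

This is why the handshake-heavy phases of DTLS are especially sensitive to the network configurations emulated in the framework.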
AI-based Ground Penetrating Radar Signal Processing for Thickness Estimation of Subsurface Layers
(2023)
This thesis focuses on the estimation of subsurface layer thickness using Ground Penetrating Radar (GPR) A-scan and B-scan data through the application of neural networks. The objective is to develop accurate models capable of estimating the thickness of up to two subsurface layers.
Two different approaches are explored for processing the A-scan data. In the first approach, A-scans are compressed using Principal Component Analysis (PCA), and a regression feedforward neural network is employed to estimate the layers’ thicknesses. The second approach utilizes a regression one-dimensional Convolutional Neural Network (1-D CNN) for the same purpose. Comparative analysis reveals that the second approach yields superior results in terms of accuracy.
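The PCA compression step of the first approach can be sketched with NumPy alone. The trace length, trace count, and component count below are illustrative rather than the values used in the thesis, and random noise stands in for real A-scans:

```python
import numpy as np

def pca_compress(scans: np.ndarray, n_components: int):
    """Project A-scans (one per row) onto their top principal components."""
    mean = scans.mean(axis=0)
    centered = scans - mean
    # Rows of vt are the principal axes, ordered by explained variance
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]
    return centered @ basis.T, basis, mean

# Synthetic stand-in for real GPR A-scans: 100 traces of 512 samples each
rng = np.random.default_rng(0)
scans = rng.normal(size=(100, 512))
codes, basis, mean = pca_compress(scans, n_components=16)
# The 16-dimensional codes would then feed a small regression network
```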
Subsequently, the proposed 1-D CNN architecture is adapted and evaluated for Step Frequency Continuous Wave (SFCW) radar, expanding its applicability to this type of radar system. The effectiveness of the proposed network in estimating subsurface layer thickness for SFCW radar is demonstrated.
Furthermore, the thesis investigates the utilization of GPR B-scan images as input data for subsurface layer thickness estimation. A regression CNN is employed for this purpose, although the results achieved are not as promising as those obtained with the 1-D CNN using A-scan data. This disparity is attributed to the limited availability of B-scan data, as B-scan generation is a resource-intensive process.
Annotated training data is essential for supervised learning methods. Human annotation is costly and laborious, especially if a dataset consists of hundreds of thousands of samples and annotators need to be hired. Crowdsourcing emerged as a solution that makes it easier to get access to large numbers of human annotators. Introducing paid external annotators, however, also brings malevolent annotations, both intentional and unintentional. Both forms have negative effects on further usage of the data and can be summarized as spam. This work explores different approaches to post-hoc detection of spamming users and which kinds of spam they can detect. A manual annotation checking process resulted in a small user spam dataset that is used in this thesis. Finally, an outlook on future improvements of these approaches is given.
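One common post-hoc detection approach, scoring each annotator by agreement with the per-item majority vote, can be sketched as follows. The vote data and the majority-vote heuristic are illustrative; the thesis may use different detection criteria:

```python
from collections import Counter

def agreement_scores(labels_by_annotator):
    """Score each annotator by how often their label matches the per-item
    majority vote; consistently low scorers are spam candidates."""
    per_item = {}
    for ann, labels in labels_by_annotator.items():
        for item, lab in labels.items():
            per_item.setdefault(item, []).append(lab)
    majority = {item: Counter(labs).most_common(1)[0][0]
                for item, labs in per_item.items()}
    return {ann: sum(majority[i] == lab for i, lab in labels.items()) / len(labels)
            for ann, labels in labels_by_annotator.items()}

# Hand-made toy votes; a real crowdsourcing dataset has thousands of items
votes = {
    "a1":      {1: "cat", 2: "dog", 3: "cat"},
    "a2":      {1: "cat", 2: "dog", 3: "cat"},
    "spammer": {1: "dog", 2: "cat", 3: "dog"},
}
scores = agreement_scores(votes)
```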
When a patient with hearing aids needs to partake in audiometry procedures, they need to visit a specialist, which costs both time and money. Ideally, the patient should be able to conduct these tests alone, in their own time, and without additional costs. This idea raises the question of whether that is possible and, if it is, how.
This thesis explores the throughput of Bluetooth Low Energy and whether it can be configured with a high enough data rate to send high-quality audio data with a lossless audio codec while communicating with a low-end device. Additionally, this thesis shows that using Rust to develop embedded software is possible and how doing so can make the process easier.
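A back-of-the-envelope throughput estimate shows the kind of budget involved. The payload size, packet count, and connection interval below are assumed values, not measurements from the thesis:

```python
def ble_throughput_bps(payload_bytes: int, packets_per_event: int,
                       conn_interval_s: float) -> float:
    """Rough application-layer throughput of a BLE connection, assuming a
    fixed number of data PDUs per connection event and ignoring
    retransmissions and link-layer overhead."""
    return payload_bytes * 8 * packets_per_event / conn_interval_s

# Illustrative numbers: 244-byte payloads, 4 packets per 7.5 ms
# connection event on the 2M PHY
rate = ble_throughput_bps(244, 4, 0.0075)   # roughly 1.04 Mbit/s
cd_quality = 2 * 16 * 44_100                # 1.4112 Mbit/s uncompressed stereo
# A lossless codec at ~60 % of the raw rate would fit within this budget
```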
The status quo of PROFINET, a commonly used industrial Ethernet standard, provides no inherent security in its communication protocols. In this thesis, an approach for protecting real-time PROFINET RTC messages against spoofing, tampering, and optionally information disclosure is specified and implemented in a real-world prototype setup. For this purpose, authenticated encryption is used, which relies on symmetric cipher schemes. In addition, a procedure to update the symmetric encryption key in a bumpless manner, i.e. without interrupting the real-time communication, is introduced and realized.
The concept for protecting the PROFINET RTC messages was developed in collaboration with a task group within the security working group of PROFINET International. The author of this thesis has also been part of that task group. This thesis contributes by proving the practicality of the concept in a real-world prototype setup, which consists of three FPGA-based development boards that communicate with each other to showcase bumpless key updates.
To enable a bumpless key update without disturbing the deterministic real-time traffic with dedicated messages, the key update announcement and status are embedded into the header. By provisioning two key slots, of which only one is in use while the other is being prepared, a well-synchronized, coordinated switch between receiver and sender performs the key update.
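The two-slot mechanism can be sketched as a small state machine. This is a simplified illustration of the idea, not the PROFINET implementation itself:

```python
class TwoSlotKeyStore:
    """Sketch of the two-slot scheme: one slot holds the key in use while
    the other is provisioned in the background; a single slot bit carried
    in the frame header tells the receiver which key to apply, so no
    dedicated key-update messages disturb the real-time traffic."""

    def __init__(self, initial_key: bytes):
        self.slots = [initial_key, None]
        self.active = 0

    def provision(self, new_key: bytes) -> None:
        # Prepare the idle slot; the active key keeps protecting traffic
        self.slots[1 - self.active] = new_key

    def switch(self) -> None:
        # Coordinated flip, announced via the header bit on both sides
        assert self.slots[1 - self.active] is not None, "idle slot empty"
        self.active = 1 - self.active

    def key_for(self, slot_bit: int) -> bytes:
        return self.slots[slot_bit]
```

Sender and receiver each hold such a store; once both have provisioned the idle slot, flipping the header bit switches keys without a gap in the traffic.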
The developed prototype setup allows the concept to be tested and builds the foundation for further research and implementation activities, e.g. studying the impact of cryptographic operations on processing time.
Decarbonisation Strategies in Energy Systems Modelling: APV and e-tractors as Flexibility Assets
(2023)
This work presents an analysis of the impact of introducing Agrophotovoltaic technologies and electric tractors into Germany’s energy system. Agrophotovoltaics involves installing photovoltaic systems in agricultural areas, allowing for dual usage of the land for both energy generation and food production. Electric tractors, which are agricultural machinery powered by electric motors, can also function as energy storage units, providing flexibility to the grid. The analysis includes a sensitivity study to understand how the availability of agricultural land influences Agrophotovoltaic investments, followed by the examination of various scenarios that involve converting diesel tractors to electric tractors. These scenarios are based on the current CO2 emission reduction targets set by the German Government, aiming for a 65% reduction below 1990 levels by 2030 and achieving zero emissions by 2045. The results indicate that approximately 3% of available agricultural land is necessary to establish a viable energy mix in Germany. Furthermore, the expansion of electric tractors tends to reduce the overall system costs and enhances the energy-cost-efficiency of Agrophotovoltaic investments.
Conceptualization and implementation of automated optimization methods for private 5G networks
(2023)
Today’s companies are adjusting to new connectivity realities. New applications require more bandwidth, lower latency, and higher reliability as industries become more distributed and autonomous. A private 5th Generation (5G) network, known as a 5G Non-Public Network (5G-NPN), is a 3rd Generation Partnership Project (3GPP)-based 5G network that can deliver seamless, dedicated wireless access for a particular industrial use case by meeting the application’s requirements. To meet these requirements, several radio-related aspects and network parameters must be considered. In many cases, the behavior of the link may vary with wireless conditions, available network resources, and User Equipment (UE) requirements. Furthermore, optimizing these networks can be a complex task due to the large number of network parameters and KPIs that need to be considered. For these reasons, traditional solutions and static network configuration are unaffordable or simply impossible. Although papers in the literature address several optimization methods for cellular networks in industrial scenarios, more insight into these existing but complex or little-known methods is needed.
In this thesis, a series of optimization methods was implemented to deliver an optimal configuration for a private 5G network. To support this, a testing system was developed that enables remote control of the UE and 5G network, establishment of a test environment, extraction of relevant KPI reports from both the UE and network sides, assessment of test results and KPIs, and effective use of the optimization and sampling techniques.
The research highlights the advantages of automated testing using OFAT, Simulated Annealing, and Random Forest Regressor methods. With OFAT, as a common sampling method, a sensitivity analysis revealed the impact of each single parameter variation on network performance. Simulated Annealing found an optimal solution with an MSE of roughly 10. The Random Forest Regressor presented a significant advantage over Simulated Annealing, providing substantial gains in time efficiency due to its machine-learning capability. Additionally, a larger dataset or other machine-learning techniques might yield a more accurate solution.
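A generic Simulated Annealing loop of the kind applied here can be sketched as follows, with a toy one-dimensional cost function standing in for the real network parameters and KPIs:

```python
import math
import random

def simulated_annealing(cost, neighbour, x0, t0=1.0, cooling=0.99,
                        steps=2000, seed=0):
    """Generic simulated annealing: worse configurations are accepted with
    probability exp(-delta/T), letting the search escape local minima."""
    rng = random.Random(seed)
    x, c = x0, cost(x0)
    best, best_c = x, c
    t = t0
    for _ in range(steps):
        cand = neighbour(x, rng)
        cand_c = cost(cand)
        if cand_c < c or rng.random() < math.exp(-(cand_c - c) / t):
            x, c = cand, cand_c
            if c < best_c:
                best, best_c = x, c
        t *= cooling  # geometric cooling schedule
    return best, best_c

# Toy one-dimensional stand-in for a network-parameter cost surface;
# the thesis optimizes real 5G KPIs, not this function
best, best_c = simulated_annealing(
    cost=lambda x: (x - 3.0) ** 2,
    neighbour=lambda x, rng: x + rng.uniform(-0.5, 0.5),
    x0=10.0,
)
```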
Server Side Rendering (SSR), Single Page Applications (SPA), and Static Site Generation (SSG) are the three most popular ways of building modern Web applications today. A deeper look into these approaches can be helpful for both developers and clients. Developers benefit because they do not need to learn other programming languages and can instead use their existing experience to build different kinds of Web applications; for example, a developer can use only JavaScript in all three approaches. Clients, in turn, can give their users a better experience.
The purpose of this Master’s Thesis was to compare these approaches using a demo application for each and to give readers a solid understanding of which approach to follow. We discussed the step-by-step process of building three applications in the above-mentioned categories. We then compared them on criteria such as performance, security, Search Engine Optimization, developer preference, learning curve, content and purpose of the Web, user interface, and user experience. The thesis also covers technologies such as JavaScript, React, Node.js, and Next.js, and why and where to use them. The goals specified before development were fulfilled, as validated by comparing the solutions provided for user problems, which was the application’s primary purpose.
This research presents a comprehensive exploration of hydroponic systems and their practical applications, with a focus on innovative solutions for managing environmental and analytical sensors in hydroponic setups. Hydroponic systems, which enable soilless cultivation, have gained increasing importance in modern agriculture due to their resource-efficient and high-yield nature.
The study delves into the development and deployment of the SensVert system, an adaptable solution tailored for hydroponic environments. SensVert offers adaptability and accessibility to farmers across various agricultural domains, addressing contemporary challenges in supervising and managing environmental and analytical sensors within hydroponic setups. Leveraging LoRa technology for seamless wireless data transmission, SensVert empowers users with a feature-rich dashboard for real-time monitoring and control. The study showcases the practical implementation of SensVert through a single sensor node, seamlessly integrating temperature, humidity, pressure, light, and pH sensors. The system automates pH regulation, employing the Henderson-Hasselbalch equation, and precisely controls liquid dosing using a PID controller. At the core of SensVert lies an architecture comprising The Things Stack as the network server, Node-RED as the application server, and Grafana as the user interface. These components synergize within a local network hosted on a Raspberry Pi, effectively mitigating challenges associated with data packet transmission in areas with limited internet connectivity.
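The pH-regulation building blocks, the Henderson-Hasselbalch equation and a PID controller, can be sketched as follows. The gains, setpoint, and concentrations are illustrative, not the values tuned in SensVert:

```python
import math

def henderson_hasselbalch_ph(pka: float, base_conc: float,
                             acid_conc: float) -> float:
    """pH of a buffer solution: pH = pKa + log10([A-]/[HA])."""
    return pka + math.log10(base_conc / acid_conc)

class PID:
    """Minimal PID controller for dosing a pH-correction liquid."""

    def __init__(self, kp: float, ki: float, kd: float, setpoint: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measurement: float, dt: float) -> float:
        error = self.setpoint - measurement
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        # Output would drive the dosing pump duty cycle
        return self.kp * error + self.ki * self.integral + self.kd * deriv
```

For an acetate buffer with equal acid and base concentrations, the equation gives pH = pKa, a handy sanity check for the sensor pipeline.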
As part of ongoing research, this work also paves the way for future advancements. These include the establishment of a wireless sensor network (WSN) utilizing LoRa technology, enabling seamless over-the-air sensor node updates for maintenance or replacement scenarios. These enhancements promise to further elevate the system's reliability and functionality within hydroponic cultivation, fostering sustainable agricultural practices.
This thesis studies the integration of digitalization techniques aimed at improving energy supply efficiency in off-grid energy systems. The primary objective is to strengthen the security of energy supply in remote areas, particularly during adverse weather conditions, unanticipated changes in load, and fluctuations in the performance of renewable energy systems. This objective is achieved through a smart load management strategy for stand-alone photovoltaic systems (SAPVS). The strategy involves deploying forecasting algorithms on an edge device with limited processing resources in an environment characterized by the lack of an internet connection. The edge device interacts with a smart home gateway that prioritizes and schedules smart appliances based on the forecasted state of charge (SOC) for the 36 hours ahead of SAPVS operation (the implementation of the load schedule deployed on the Home Assistant device is out of the scope of this project).
The edge device, built on a Raspberry Pi 3B+, is intended to be deployed alongside a SAPVS in remote areas such as health stations in Africa and on tropical islands, providing communities with a reliable source of electrical energy. The deployment of the strategy was carried out in four phases. The first phase involved the implementation of an Extraction-Transformation-Load (ETL) pipeline, in which data was gathered from various heterogeneous hardware sources of a test system that served as the enabler and test bench of this research. This test stand is composed of power electronics components such as an inverter, an MPPT solar charge controller, a smart meter, and a BOS LiFePO4 battery prototype. In the transformation stage, a data model was developed to identify the most critical parameters of the energy system and to eliminate outliers and null values. In the load stage, a local SQL database was established to save and structure the gathered data and to ensure high-quality data with defined units and type casting.
The second phase involved data analysis to identify the relevant features and potential exogenous variables for the forecasting model. In the third phase, an Auto Regressive Moving Average (ARMA) model with two selected exogenous variables was implemented to forecast the AC load consumption profile for the 36 hours ahead of off-grid system operation. The final phase involved the exchange of information with the Home Assistant device: the edge device transfers the present battery SOC value and the predicted 36-hour-ahead AC load profile for prioritization and scheduling of loads, over an MQTT interface.
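The forecasting step can be illustrated with a pure-AR sketch fitted by least squares. The thesis uses a full ARMA model with exogenous variables; this simplified version omits the moving-average and exogenous terms, and the synthetic series stands in for real load data:

```python
import numpy as np

def fit_ar(series: np.ndarray, p: int) -> np.ndarray:
    """Least-squares fit of y_t = c + a1*y_(t-1) + ... + ap*y_(t-p)."""
    rows = [series[i - p:i][::-1] for i in range(p, len(series))]
    X = np.hstack([np.ones((len(rows), 1)), np.array(rows)])
    coef, *_ = np.linalg.lstsq(X, series[p:], rcond=None)
    return coef  # [c, a1, ..., ap]

def forecast(series, coef, horizon):
    """Roll the fitted model forward, feeding predictions back in."""
    p = len(coef) - 1
    hist = list(series[-p:])
    out = []
    for _ in range(horizon):
        nxt = coef[0] + sum(coef[i + 1] * hist[-1 - i] for i in range(p))
        hist.append(nxt)
        out.append(nxt)
    return out

# Synthetic AR(1) load profile y_t = 1 + 0.5*y_(t-1); the real input would
# be the hourly AC load stored in the local SQL database
y = [0.0]
for _ in range(15):
    y.append(1.0 + 0.5 * y[-1])
coef = fit_ar(np.array(y), p=1)
ahead = forecast(np.array(y), coef, horizon=36)  # 36-hour-ahead profile
```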
The outcome of the experiment was the successful deployment of a data engineering and forecasting approach that enabled a data quality strategy, local database storage, and forecasting algorithms on a processing- and internet-constrained edge device. The interface with the Home Assistant implementation resulted in the successful execution of smart load management in an off-grid system, thereby enhancing the security of energy supply and contributing to the advancement of data-driven strategies in the rural electrification sector.
This thesis emphasizes the significance of digitalization strategies in smart SAPVS and highlights the potential of edge computing solutions in achieving seamless energy management in smart homes.
Organizations striving for long-term success must maintain a positive brand image, which has direct implications for the business. In the face of rising cyber threats and intense competition, maintaining a threat-free domain is an important part of preserving that image on today's internet. Domain names are often near-synonyms for brand names for numerous companies. There are likely thousands of domains that try to impersonate big companies in a bid to trap unsuspecting users, who often fall prey to attacks such as phishing or watering holes. Because domain names are important for organizations running their business online, they are also particularly vulnerable to misuse by malicious actors. So, how can you ensure that your domain name is protected while still protecting your brand identity? Brand Monitoring, for example, may assist. The term "Brand Monitoring" refers to keeping tabs on an organization's brand performance, reception, and overall online presence through various online channels and platforms [1]. As the threat environment has expanded, so has the need to keep one's domain clear of any links to malicious activities. Since attackers target organizations' domain names and lure unsuspecting users to malicious websites, domain monitoring becomes an important aspect. Another important aspect of brand abuse is how attackers leverage brand logos to create fake and phishing web pages. In this Master's Thesis, we address the classification of impersonated domains using rule-based and machine learning algorithms, and the automation of domain monitoring. We first use a rule-based classifier and machine learning algorithms to classify the gathered domains into two buckets: "Parked" and "Non-Parked".
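A toy version of such a rule-based Parked/Non-Parked classifier might look as follows; the marker strings and thresholds are invented for illustration, not the rules used in the thesis:

```python
# Phrases that typically appear on parked/for-sale landing pages
PARKED_MARKERS = (
    "this domain is for sale",
    "buy this domain",
    "domain parking",
    "related searches",
)

def classify_parked(page_text: str, num_outlinks: int) -> str:
    """Classify a fetched page as "Parked" or "Non-Parked" with two simple
    rules: sale/ad-feed marker phrases, or near-empty pages without links."""
    text = page_text.lower()
    if any(marker in text for marker in PARKED_MARKERS):
        return "Parked"
    if num_outlinks == 0 and len(text) < 200:
        return "Parked"
    return "Non-Parked"
```

In practice such hand-written rules are a first pass; the machine learning classifiers then handle the domains the rules cannot decide.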
In the project's second phase, we will deploy object detection methods (Scale Invariant Feature Transform, SIFT, and Multi-Template Matching, MTM) to detect brand logos on the domains of interest.
"Ad fontes!"
Francesco Petrarca (1301–1374)
In the beginning, there was an idea: the reconstruction of the first "Iron Hand" of the Franconian imperial knight Götz von Berlichingen (1480–1562). We found that with this historical prosthesis, simple actions for daily use, such as holding a wine glass, a mobile phone, a bicycle handlebar grip, a horse’s reins, or some grapes, are possible without effort. Controlling this passive artificial hand, however, is based on the help of a healthy second hand.
The goal of this thesis is to thoroughly investigate the concepts of stand-alone operation and decarbonization of optical fiber networks. Because of their dependability, high speed, and capacity, optical fiber networks are vital in modern telecommunications. Their considerable energy consumption and carbon emissions, on the other hand, pose a threat to global sustainability objectives and must be addressed.
The first section of the thesis presents a summary of the current state of optical fiber networks, their components, and the energy consumption connected with them. This part also goes over the difficulties of lowering energy usage and carbon emissions while preserving network performance and dependability.
The second section of the thesis focuses on the stand-alone idea, which entails powering the optical fiber network with renewable energy sources and energy-efficient technology. This section explores the potential of renewable energy sources like solar and wind power to power the network. It also investigates energy-efficient technologies like virtualization and cloud computing, as well as their potential to minimize network energy usage.
The third section of the thesis focuses on the notion of decarbonization, which entails lowering carbon emissions linked with the optical fiber network. This section looks at various carbon-reduction measures, such as employing low-carbon energy sources and improving energy efficiency. It also covers the relevance of carbon offsets and the difficulties associated with adopting decarbonization measures in the context of optical fiber networks.
The fourth section of the thesis compares the ideas of stand-alone and decarbonization. It investigates the advantages and disadvantages of each strategy, as well as their potential to minimize energy consumption and carbon emissions in optical fiber networks. It also explores the difficulties in applying these notions as well as potential hurdles to their wider adoption.
Finally, this thesis emphasizes the need to address the energy consumption and carbon emissions associated with optical fiber networks. It outlines important obstacles and potential impediments to adopting these initiatives, gives insights into potential ways of decreasing them, and makes suggestions for further study in this area.
In the past ten years, applications of artificial neural networks have changed dramatically, outperforming earlier predictions in domains like robotics, computer vision, natural language processing, healthcare, and finance. Future research and advancements in CNN architectures, algorithms, and applications are expected to further revolutionize various industries and daily life. Our task is to find current products that resemble a given product image and description. Deep learning-based automatic product identification is a multi-step process that starts with data collection and continues with model training, deployment, and continuous improvement. The quality and variety of the dataset, the chosen architecture, and ongoing testing and refinement all affect the model's effectiveness. We achieved 81.47% training accuracy and 72.43% validation accuracy for our combined text and image classification model. Additionally, we discuss the outcomes on the other dataset and numerous methods for creating an appropriate model.
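Finding resembling products typically reduces to nearest-neighbour search over combined image and text embeddings. The sketch below assumes such embeddings already exist (in the thesis they would come from the trained models) and uses tiny hand-made vectors:

```python
import numpy as np

def nearest_products(query_vec, catalogue_vecs, k=3):
    """Rank catalogue items by cosine similarity to the query embedding."""
    q = query_vec / np.linalg.norm(query_vec)
    c = catalogue_vecs / np.linalg.norm(catalogue_vecs, axis=1, keepdims=True)
    sims = c @ q
    order = np.argsort(-sims)[:k]  # indices of the k most similar items
    return order, sims[order]

# Hand-made 2-D embeddings; real ones would be high-dimensional outputs
# of the combined image+text model
catalogue = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
query = np.array([1.0, 0.1])
order, sims = nearest_products(query, catalogue, k=2)
```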
As Industry 4.0 evolves, the previously separated Operational Technology (OT) and Information Technology (IT) are converging. Connecting devices in industrial settings to the Internet exposes these systems to a broader spectrum of cyber-attacks. Because OT does not have as many security measures as IT, it is more vulnerable from an attacker's perspective. Another factor contributing to the vulnerability of OT is that, when it comes to cybersecurity, industries have focused on protecting information technology while giving less priority to control systems. The consequences of a security breach in an OT system can be more severe, as it can lead to physical damage, industrial accidents, and physical harm to human beings. Hence, certificate-based authentication is implemented for OT networks. This involves stages of managing credentials in the communication endpoints. In previous work at ivESK, a solution was developed for managing credentials, involving a CANopen-based physical demonstrator on which the certificate management processes were developed. The extended feature set for certificate management will be based on this existing solution. The thesis aims to significantly improve the solution by addressing two key areas: enhancing functionality and optimizing real-time performance. Regarding the first goal, an analysis of the existing feature set shall be carried out, and correct functionality shall be guaranteed. The limitations of the previously implemented system will be addressed, and to ensure applicability to real-world scenarios, the system will be implemented and tested on the physical demonstrator. This lays a concrete foundation for using these certificate management processes in large-scale industrial networks.
Features such as a revocation mechanism for certificates, automated renewal of credentials, and authorization attribute checks for certificate management will be implemented. Regarding the second goal, the impact of credential management processes on ongoing CANopen real-time traffic shall be studied. In real-life scenarios, mission-critical applications such as industrial control systems, medical devices, and transportation networks rely on real-time communication for reliable operation, so delays or disruptions caused by credential management processes can have severe consequences. Optimizing these processes is crucial for maintaining system integrity and safety. The disturbance the credential management processes cause to the normal operation of the CANopen network shall be characterized and minimized. This comprises testing real-time parameters of the network such as CPU load, network load, and average delay. The results obtained from each of these tests will be studied.
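Two of the planned features, revocation checks and automated renewal, can be sketched in a few lines. The serial numbers and the 30-day renewal lead are illustrative assumptions, not values from the thesis:

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical revocation list: serial numbers of withdrawn certificates
REVOKED_SERIALS = {0x1A2B, 0x3C4D}

def is_revoked(serial: int) -> bool:
    """CRL-style lookup against the locally distributed revocation set."""
    return serial in REVOKED_SERIALS

def needs_renewal(not_after: datetime,
                  lead: timedelta = timedelta(days=30),
                  now: Optional[datetime] = None) -> bool:
    """Automated-renewal rule of thumb: renew once the certificate is
    within `lead` of its notAfter date (the 30-day lead is illustrative)."""
    now = now or datetime.now(timezone.utc)
    return now >= not_after - lead
```

On the demonstrator, such checks would run in the background so that renewal traffic can be scheduled around the real-time CANopen cycles.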
Garbage in, Garbage out: How does ambiguity in data affect state-of-the-art pedestrian detection?
(2024)
This thesis investigates the critical role of data quality in computer vision, particularly in the realm of pedestrian detection. The proliferation of deep learning methods has emphasised the importance of large datasets for model training, while the quality of these datasets is equally crucial. Ambiguity in annotations, arising from factors like mislabelling, inaccurate bounding box geometry and annotator disagreements, poses significant challenges to the reliability and robustness of pedestrian detection models and their evaluation. This work explores the effects of ambiguous data on model performance, with a focus on identifying and separating ambiguous instances using an ambiguity measure based on annotator estimations of object visibility and identity. Through careful experimentation and analysis, trade-offs emerged between data cleanliness and representativeness, and between noise removal and retention of valuable data, elucidating their impact on performance metrics like the log average miss rate, recall and precision. Furthermore, a strong correlation between ambiguity and occlusion was discovered, with higher ambiguity corresponding to greater occlusion prevalence. The EuroCity Persons dataset served as the primary dataset, revealing a significant proportion of ambiguous instances: approximately 8.6% in the training set and 7.3% in the validation set. Results demonstrated that removing ambiguous data improves the log average miss rate, particularly by reducing false positive detections. Augmenting the training data with samples from neighbouring classes enhanced recall but diminished precision. Correcting erroneous false positives and false negatives significantly impacts model evaluation results, as evidenced by shifts in the ECP leaderboard rankings.
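The log average miss rate used as the headline metric can be computed as the geometric mean of miss rates sampled at nine FPPI reference points; the interpolation convention below is one common choice, not necessarily the exact ECP evaluation code:

```python
import numpy as np

def log_average_miss_rate(fppi: np.ndarray, miss_rate: np.ndarray) -> float:
    """Log-average miss rate over 9 FPPI reference points in [1e-2, 1],
    the usual summary metric on pedestrian-detection leaderboards."""
    samples = []
    for ref in np.logspace(-2, 0, 9):
        mask = fppi <= ref
        # Best (lowest) miss rate achieved at FPPI <= ref; if the curve
        # never gets that low, fall back to the worst recorded miss rate
        samples.append(miss_rate[mask].min() if mask.any() else miss_rate.max())
    return float(np.exp(np.log(np.clip(samples, 1e-10, None)).mean()))

# Sanity check: a flat curve at 50 % miss rate gives LAMR = 0.5
lamr = log_average_miss_rate(np.logspace(-3, 1, 50), np.full(50, 0.5))
```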
By systematically addressing ambiguity, this thesis lays the foundation for enhancing the reliability of computer vision systems in real-world applications, motivating the prioritisation of developing robust strategies to identify, quantify and address ambiguity.