This Master's Thesis discusses intelligent sensor networks with respect to autonomous sensor placement strategies and system health management. Sensor networks have recently been researched as part of intelligent system design. These networks consist of a distributed collective of sensing units, each capable of individual sensing and computation. Such systems can be capable of self-deployment and must be scalable, long-lived and robust. With distributed sensor networks, intelligent sensor placement for system design and online system health management are attractive areas of research. Distributed sensor networks also pose optimization problems, such as decentralized control, system robustness and maximization of coverage, including the discovery and analysis of points of interest within an environment. The purpose of this study was to investigate a method to autonomously control sensor placement in a world with several sources and multiple types of information. This includes both controlling the movement of sensor units and filtering the gathered information based on individual properties to increase system performance, defined as good coverage. Additionally, online system health management was examined for the case of agent failures, together with autonomous policy reconfiguration when sensors are added to or removed from the system. Two solution strategies were devised: one in which the environment was fully observable, and one with only partial observability. Both strategies use evolutionary algorithms based on artificial neural networks to develop control policies. For performance measurement and policy evaluation, different multiagent objective functions were investigated.
The results of the study show that, in the case of a world with multiple types of information, individual control strategies performed best because of their ability to control the movement of a sensor entity and to filter the sensed information. This also extends to system robustness in the case of sensor failures, where the remaining sensing units must recover system performance. Additionally, autonomous policy reconfiguration after adding or removing sensor agents was successful. This highlights that intelligent sensor agents are able to adapt their individual control policies to new circumstances.
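The kind of evolutionary loop described above can be sketched in a few lines. The tiny policy representation, mutation scheme and stand-in fitness function below are illustrative assumptions, not the thesis implementation:

```python
import random

random.seed(0)

def fitness(weights: list[float]) -> float:
    # Stand-in coverage objective: reward policies near an assumed target.
    target = [0.5, -0.2, 0.8]
    return -sum((w - t) ** 2 for w, t in zip(weights, target))

def mutate(weights: list[float], sigma: float = 0.1) -> list[float]:
    # Gaussian perturbation of the policy parameters.
    return [w + random.gauss(0, sigma) for w in weights]

# Random initial population of small policies.
population = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(20)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    elite = population[:5]                        # keep the best policies
    population = elite + [mutate(random.choice(elite))
                          for _ in range(15)]     # refill by mutation

best = max(population, key=fitness)
```

In a multiagent setting the fitness function would instead score an agent's contribution to system-level coverage, which is exactly where the objective functions compared in the thesis differ.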
Distributed Flow Control and Intelligent Data Transfer in High Performance Computing Networks
(2015)
This document contains my master's thesis report, including the problem definition, requirements, problem analysis, a review of the current state of the art, the proposed solution, the designed prototype, discussion, and conclusions.
In this work we propose a collaborative solution for running different types of operations in a brokerless network without relying on a central orchestrator.
Based on our requirements, we define and analyze a number of scenarios. We then design a solution that addresses those scenarios using a distributed workflow management approach. We explain how we break a complicated operation into simpler parts and how we manage them in a non-blocking, distributed way. We then show how we launch them asynchronously on the network and how we collect and aggregate the results. Finally, we introduce our prototype, which demonstrates the proposed design.
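The scatter-gather pattern described above — splitting an operation, launching the parts asynchronously, and aggregating the results — can be sketched as follows. The peer names and the squaring "work" are illustrative assumptions, not the thesis prototype:

```python
import asyncio

async def run_on_peer(peer: str, subtask: int) -> int:
    # Stand-in for sending a sub-operation to a peer and awaiting its reply.
    await asyncio.sleep(0)        # simulate non-blocking network I/O
    return subtask * subtask      # the peer's partial result

async def run_operation(subtasks: list[int], peers: list[str]) -> int:
    # Scatter: dispatch each sub-task without blocking on the others.
    coros = [run_on_peer(peers[i % len(peers)], t)
             for i, t in enumerate(subtasks)]
    partials = await asyncio.gather(*coros)   # gather all partial results
    return sum(partials)                      # aggregate

result = asyncio.run(run_operation([1, 2, 3, 4], ["peer-a", "peer-b"]))
print(result)  # 1 + 4 + 9 + 16 = 30
```

In a real brokerless network the dispatch step would go over the wire to the peers, but the non-blocking launch and the aggregation step have the same shape.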
Quartz crystal microbalances allow monitoring of the adsorption of mass from a liquid onto their surface. The adsorbed mass can be analysed for its protein content using mass spectrometry. To ensure reliable protein identification, the results of several measurements can be combined. A high-content QCM-D array was developed that allows up to ten measurements in parallel. Samples can be routed inside the array, distributing one sample to several chips. The fluidic parts were prototyped using 3D printing. The assembled array was leak-tight, and the sample routing function could be demonstrated. A temperature controller was developed and implemented. The parameters for the PID controller were determined, and the controller was shown to keep the temperature constant over long periods with high accuracy.
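A discrete PID controller of the kind used for such temperature control can be sketched minimally. The gains and setpoint below are illustrative assumptions; the thesis determined its own parameters for the QCM-D array:

```python
class PID:
    def __init__(self, kp: float, ki: float, kd: float, setpoint: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measurement: float, dt: float) -> float:
        # Proportional term on the current error, integral term on the
        # accumulated error, derivative term on the error's rate of change.
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical gains and target temperature, not the thesis values.
pid = PID(kp=2.0, ki=0.1, kd=0.5, setpoint=37.0)
power = pid.update(measurement=25.0, dt=1.0)  # controller output, e.g. heater power
```

Tuning then consists of finding kp, ki and kd such that the temperature settles on the setpoint without overshoot or oscillation.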
The vision of the "Internet of Things" has shaped research and development around smart technologies and device networking for years. In the future, the real world will increasingly be linked to the internet, enabling numerous everyday objects (things) to interact and to communicate both online and autonomously. Many industries such as medicine, automotive manufacturing, energy supply and consumer electronics are equally affected, creating new economic potential despite the risks. In the "Connected Home" domain, solutions already exist that use the intelligent networking of household appliances and sensors to improve the quality of life at home. This thesis deals with the Thread protocol, a new technology for integrating multiple communication interfaces within one network. In addition, the implementation at the network layer is presented, along with prepared information on the technologies used.
Singapore’s success in transforming itself from a poor, vulnerable economy into one of the richest countries in the world (IMF, 2016) is nothing short of inspirational to many small economies around the globe. Given its lack of resources, Singapore relied upon foreign investors to fuel its growth, not only through cash injections into the economy in the form of Foreign Direct Investment (FDI) but also to help upgrade its skills and technological stock. This study looks at how Singapore inspired many Multi-National Corporations (MNCs) to pour large sums of investment into this small, ailing city-state and whether this idea can be generalized to apply to other economies, especially Oman.
In a bid to explain the large flow of capital into an economy, this study goes on to review the most prominent literature in the field since MacDougall (1958) first laid the groundwork for the subsequent theories on FDI. Based on the review of several previous studies, the most significant determinants of FDI were found to be government policy and political stability, the inflation rate as a proxy for economic stability, the quality of infrastructure and institutions, the market size of the host country, openness to trade, tax policies and access to low-cost factors of production.
Through a case study method with an inductive approach, this study finds that Singapore excels in all of the determinants of FDI except for the market size of the host country and access to low-cost factors of production. However, it more than compensates for these shortcomings with its strategic geographical location and numerous bilateral and regional trade agreements that give it access to markets around the region. Oman, like Singapore, ranks well in many of these determinants, which makes it a potential destination for investment. However, the sultanate could gain more interest from MNCs to help its growth by optimizing its policies to lower existing barriers, easing immigration laws to meet the short-term skill shortage, allowing 100 percent foreign ownership, allowing more liberal property rights, working to improve its corruption perception, and opting for more trade agreements to gain easy access to larger markets. Moreover, the economy's heavy reliance on hydrocarbon exports is seen as a major risk by investors, as it creates an economic vulnerability that could potentially overshadow many other benefits of investing in the sultanate. Besides the aforementioned determinants, much also depends on the success of Oman's diversification plans.
WebAssembly is a new technology for creating applications in a new way. It has been developed since 2017 by the World Wide Web Consortium (W3C), and its primary task is to improve web applications.
Today, more and more applications are being created as web applications. Web applications have some advantages: they are platform independent, even mobile platforms can run them, and no installation is needed apart from a modern web browser.
Currently, web applications are developed in JavaScript (JS), Hypertext Markup Language 5 (HTML5), and Cascading Style Sheets (CSS).
These technologies were not made for large web applications, but WebAssembly is not meant to replace them; rather, it is an extension of the currently existing technology.
The purpose of WebAssembly is to fix or mitigate the problems in web application development.
This master's thesis reviews all of these aspects and examines whether the promises of WebAssembly are kept and where problems still exist.
Annotated training data is essential for supervised learning methods. Human annotation is costly and laborious, especially when a dataset consists of hundreds of thousands of samples and annotators need to be hired. Crowdsourcing emerged as a solution that makes it easier to get access to large numbers of human annotators. Hiring paid external annotators, however, introduces malevolent annotations, both intentional and unintentional. Both forms of malevolent annotation have negative effects on further usage of the data and can be summarized as spam. This work explores different approaches to post-hoc detection of spamming users and which kinds of spam they can detect. A manual annotation checking process resulted in the creation of a small user spam dataset, which is used in this thesis. Finally, an outlook for future improvements of these approaches is given.
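One common family of post-hoc approaches flags annotators whose labels rarely agree with the per-item majority vote. The sketch below illustrates that idea; the threshold and the toy data are assumptions for illustration, not the thesis dataset or method:

```python
from collections import Counter

def agreement_scores(annotations: dict[str, dict[str, str]]) -> dict[str, float]:
    """annotations maps item_id -> {annotator_id: label}."""
    # Majority label per item.
    majority = {item: Counter(labels.values()).most_common(1)[0][0]
                for item, labels in annotations.items()}
    hits: Counter = Counter()
    totals: Counter = Counter()
    for item, labels in annotations.items():
        for annotator, label in labels.items():
            totals[annotator] += 1
            hits[annotator] += label == majority[item]
    # Fraction of an annotator's labels that match the majority.
    return {a: hits[a] / totals[a] for a in totals}

data = {
    "q1": {"alice": "cat", "bob": "cat", "eve": "dog"},
    "q2": {"alice": "dog", "bob": "dog", "eve": "cat"},
    "q3": {"alice": "cat", "bob": "cat", "eve": "cat"},
}
scores = agreement_scores(data)
suspects = [a for a, s in scores.items() if s < 0.5]  # low agreement -> spam candidate
```

Majority-vote agreement catches random and adversarial labeling but not colluding spammers, which is one reason multiple detection approaches are worth comparing.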
Communication protocols enable information exchange between different information systems. If protocol descriptions for these systems are not available, the protocols can be reverse-engineered for interoperability or security reasons. This master's thesis describes the analysis of such a proprietary binary protocol, named DVRIP or the Dahua private protocol, from Dahua Technology. The analysis covers the identification of the DVRIP protocol header format, its security mechanisms, and vulnerabilities in the protocol implementation. The insights gained into the protocol enable an increase in overall security. This thesis builds the foundation for further targeted security analyses.
The status quo of PROFINET, a commonly used industrial Ethernet standard, provides no inherent security in its communication protocols. In this thesis, an approach for protecting real-time PROFINET RTC messages against spoofing, tampering and, optionally, information disclosure is specified and implemented in a real-world prototype setup. For this, authenticated encryption is used, which relies on symmetric cipher schemes. In addition, a procedure to update the symmetric encryption key in a bumpless manner, i.e. without interrupting the real-time communication, is introduced and realized.
The concept for protecting the PROFINET RTC messages was developed in collaboration with a task group within the security working group of PROFINET International, of which the author of this thesis has also been a part. This thesis contributes by proving the practicability of the concept in a real-world prototype setup, which consists of three FPGA-based development boards that communicate with each other to showcase bumpless key updates.
To enable a bumpless key update without disturbing the deterministic real-time traffic with dedicated messages, the key update annunciation and status are embedded into the header. By provisioning two key slots, of which only one is in use while the other is being prepared, a well-synchronized, coordinated switch between the receiver and the sender performs the key update.
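The two-slot mechanism can be sketched abstractly. The class below assumes a one-bit slot indicator carried in each frame header; the names and the bare key bytes are illustrative assumptions, since the actual prototype applies authenticated encryption on FPGAs:

```python
class KeySlots:
    def __init__(self, initial_key: bytes):
        self.slots = [initial_key, None]  # slot 0 active, slot 1 spare
        self.active = 0

    def provision(self, new_key: bytes) -> None:
        # Prepare the spare slot while the active key stays in use.
        self.slots[1 - self.active] = new_key

    def switch(self) -> int:
        # Coordinated switch: flip to the prepared spare slot.
        self.active = 1 - self.active
        return self.active  # value announced in the frame header

    def key_for(self, slot_bit: int) -> bytes:
        # The receiver selects the key by the slot bit in the header,
        # so frames keep decrypting correctly across the switch.
        return self.slots[slot_bit]

sender, receiver = KeySlots(b"K0"), KeySlots(b"K0")
for side in (sender, receiver):
    side.provision(b"K1")       # distribute the next key in advance
slot = sender.switch()          # sender starts using the new slot
receiver.switch()
assert receiver.key_for(slot) == b"K1"  # traffic continues uninterrupted
```

Because the slot bit travels in every frame, a receiver that has the spare slot provisioned can follow the switch on the very frame where it happens, with no dedicated key-change message in the real-time traffic.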
The developed prototype setup makes it possible to test the concept and builds the foundation for further research and implementation activities, such as investigating the impact of cryptographic operations on processing time.
Among the billions of smartphone users in the world, Android still holds more than 80% of the market share. The applications that users install offer features which need access to device functionalities and sensors that may hold sensitive information about the user. Therefore, Android releases have set permission standards to let the user know what information is being disclosed to an application. Along with other security and privacy improvements, significant changes to the permission scheme were introduced with Android 6.0 (API level 23). In this master's thesis, the Android permission scheme is tested on two devices from different eras, and the evolution of Android over the years is examined in terms of confidentiality. For each device, two applications are built: one focused on extracting every piece of information within the confidentiality scope with every permission declared and/or requested, and the other focused on obtaining the same type of information without user notification. The resulting analysis illustrates whether, and in what way, the Android permission scheme has declined or improved over time.