The interaction between agents in multiagent-based control systems requires peer-to-peer communication between agents, avoiding central control. The sensor nodes represent agents and produce measurement data at every time step. The nodes exchange time-series data over the peer-to-peer network in order to compute an aggregation function and thereby solve a problem cooperatively. We investigate the aggregation process of averaging time-series data of nodes in a peer-to-peer network using the grouping algorithm of Cichon et al. 2018. Nodes communicate whether data is new and map data values according to their magnitudes into a histogram. This map message consists of the subintervals and of vectors for estimating the nodes joining and leaving each subinterval. At each time step, the nodes communicate with each other in synchronous rounds to exchange map messages until the network converges to a common map message. Each node then calculates the average value of the time-series data produced by all nodes in the network using the histogram algorithm. The relative error between the output of averaging the time-series data and the ground-truth average in the network decreases as the size of the network increases. Our simulations show that the approximate-histogram method provides a reasonable approximation of time-series data averages.
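The idea of converging to a common map message can be illustrated with a minimal sketch. This is not the algorithm of Cichon et al.; it is a simplified stand-in in which each node tracks, per subinterval, the set of node IDs whose value falls into that bin, merges its neighbours' maps in synchronous rounds, and finally estimates the average from the bin midpoints. All function and variable names here are illustrative, not taken from the paper.

```python
def histogram_average(values, edges, neighbours, rounds=10):
    """Estimate the network-wide average via bin-membership gossip.

    values     : per-node measurement values
    edges      : histogram subinterval boundaries
    neighbours : node -> list of neighbour node ids
    Each node records which bin its own value falls into, then
    repeatedly merges (set-union) its neighbours' bin memberships.
    Once all nodes share the same map, each computes the average
    from the bin midpoints weighted by bin occupancy.
    """
    n = len(values)

    def bin_of(v):
        for i in range(len(edges) - 1):
            if edges[i] <= v < edges[i + 1]:
                return i
        return len(edges) - 2  # clamp to last bin

    # per-node map message: one set of node ids per subinterval
    maps = [[set() for _ in range(len(edges) - 1)] for _ in range(n)]
    for node, v in enumerate(values):
        maps[node][bin_of(v)].add(node)

    for _ in range(rounds):  # synchronous gossip rounds
        new_maps = [[set(b) for b in m] for m in maps]
        for node in range(n):
            for nb in neighbours[node]:
                for i in range(len(edges) - 1):
                    new_maps[node][i] |= maps[nb][i]
        maps = new_maps

    # every node estimates the average from bin midpoints
    mids = [(edges[i] + edges[i + 1]) / 2 for i in range(len(edges) - 1)]
    return [sum(mids[i] * len(m[i]) for i in range(len(m))) /
            max(1, sum(len(b) for b in m)) for m in maps]
```

After enough rounds for the maps to spread across the network diameter, every node holds the same histogram and reports the same estimate; the estimation error is bounded by the bin width, consistent with the abstract's observation that the relative error shrinks for finer, larger aggregations.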
In 2000 the iSign project started as a virtual web-based laboratory for students of the electrical engineering study programme. Continuous development over the last years has led to a heterogeneous learning environment offering learning material, adaptive user settings and access to a simulation tool. Access is available via the web and via wireless devices such as PCs, laptops, PDAs, smartphones and mobile phones. Our attempt to adapt the content to the user's needs and to the currently used device led us to an XML-based data structure. This report presents our research results on content adaptation based on XML data. The two main aspects of that process are the device capabilities and the adaptation methods using XML data.
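One simple form such XML-driven adaptation can take is selecting, per learning unit, the richest content variant a device can display. The fragment and element names below are hypothetical, invented for illustration; the report's actual schema may differ.

```python
import xml.etree.ElementTree as ET

# Hypothetical content fragment: one learning unit with variants
# tagged by the minimum screen width (in pixels) they require.
CONTENT = """
<unit id="signals-1">
  <variant minWidth="800"><media>simulation.svg</media></variant>
  <variant minWidth="240"><media>diagram.png</media></variant>
  <variant minWidth="0"><media>summary.txt</media></variant>
</unit>
"""

def adapt(xml_text, screen_width):
    """Pick the richest variant the device capabilities allow."""
    root = ET.fromstring(xml_text)
    best = None
    for var in root.findall("variant"):
        need = int(var.get("minWidth"))
        # keep the variant with the highest requirement we still meet
        if screen_width >= need and (best is None
                                     or need > int(best.get("minWidth"))):
            best = var
    return best.find("media").text if best is not None else None
```

A desktop browser would receive the full simulation, a PDA the static diagram, and a small mobile phone the plain-text summary, all from the same XML source.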
Signal detection and bandwidth estimation, also known as channel segmentation or information channel estimation, is a perennial topic in communication systems. In the field of radio monitoring this task is extremely challenging, since unforeseeable effects such as fading occur unpredictably. In addition, most radio monitoring devices scan a wide frequency range of several hundred MHz and have to detect a multitude of different signals varying in signal power, bandwidth and spectral shape. Since narrowband sensing techniques cannot be applied directly, most radio monitoring devices use Nyquist wideband sensing to cover the large frequency range. In practice, sensing is normally conducted by an FFT sweep spectrum analyzer that delivers the power spectral density (PSD) values to the radio monitoring system. Channel segmentation based on these PSD values is the initial step of a comprehensive signal analysis in a radio monitoring system. In this paper, a novel approach to channel segmentation is presented that is based on a quantization and a histogram evaluation of the measured PSD. It will be shown that only the combination of both evaluations leads to a successful automatic channel segmentation. The performance of the proposed algorithm is demonstrated in a real radio monitoring scenario.
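The combination of quantization and histogram evaluation can be sketched as follows. This is a simplified illustration of the general idea, not the paper's algorithm: the PSD is quantized in fixed dB steps, the most frequent quantization level serves as a noise-floor estimate, and contiguous runs of bins above the floor plus a margin are reported as occupied channels. The parameter names and default values are assumptions.

```python
def segment_channels(psd, step=2.0, margin=6.0):
    """Quantize the PSD, estimate the noise floor as the most
    frequent quantization level, then mark contiguous runs that
    exceed the floor by a margin as occupied channels.

    psd    : PSD values in dBm, one per FFT bin
    step   : quantization step in dB
    margin : detection threshold above the noise floor in dB
    Returns a list of (start_bin, end_bin) tuples (end exclusive).
    """
    quantized = [round(p / step) * step for p in psd]
    # histogram of quantized levels; the mode approximates the
    # noise floor because most of a wide sweep is signal-free
    counts = {}
    for level in quantized:
        counts[level] = counts.get(level, 0) + 1
    noise_floor = max(counts, key=counts.get)
    thresh = noise_floor + margin
    # contiguous bins above threshold form one channel segment
    segments, start = [], None
    for i, p in enumerate(psd):
        if p > thresh and start is None:
            start = i
        elif p <= thresh and start is not None:
            segments.append((start, i))
            start = None
    if start is not None:
        segments.append((start, len(psd)))
    return segments
```

The quantization makes the histogram peak at the noise floor sharp even under measurement jitter, which is why neither step alone suffices: raw PSD values rarely repeat exactly, and a histogram without a data-driven floor estimate needs a manually tuned threshold.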
Recent developments in information and communication technology, along with advanced display techniques and high computational performance, open up new visualisation methods to both scientists and lecturers. Thus, simulations of complex processes [1] can be computed and visualised as image sequences. The particular idea in our approach is the outsourcing of computationally intensive calculations to servers, which then send the results back to mobile users. In order to improve interpretation of the visualised results, users can view them in a 3D perspective or stereoscopically, given the technical requirements. Today’s technology even makes it possible to view these visualisations on a mobile phone. An example of such a computationally intensive calculation, originating from the theory of relativity, is depicted in Figure 4.1-1.
The developed solution enables the presentation of animations and 3D virtual reality (VR) on mobile devices and is well suited for mobile learning, thus creating new possibilities in the area of e-learning worldwide. Difficult relations in physics as well as intricate experiments in optics can be visualised on mobile devices without the need for a personal computer.
Mobile learning (m-learning) can be considered a new paradigm of e-learning. The developed solution enables the presentation of animations and 3D virtual reality (VR) on mobile devices and is well suited for mobile learning. Difficult relations in physics as well as intricate experiments in optics can be visualised on mobile devices without the need for a personal computer. By outsourcing the computational work to a server, coverage becomes worldwide.
Computing Aggregates on Autonomous, Self-organizing Multi-Agent System: Application "Smart Grid"
(2017)
Decentralized data aggregation plays an important role in estimating the state of the smart grid, allowing the determination of meaningful system-wide measures (such as the current power generation, consumption, etc.) to balance the power in the grid environment. Data aggregation is practicable in such settings only if it is performed efficiently. However, many existing approaches are lacking in terms of fault tolerance. We present an approach to constructing a robust self-organizing overlay by exploiting the heterogeneous characteristics of the nodes and interlinking the most reliable nodes to form a stable unstructured overlay. The network structure can recover from random state perturbations in finite time and tolerates substantial message loss. Our approach is inspired by biological and sociological self-organizing mechanisms.
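The core idea of interlinking the most reliable nodes can be sketched minimally. This is an illustrative simplification under the assumption that each node knows a reliability score for its peers; the paper's actual overlay construction and self-organizing rules are richer than this.

```python
def build_overlay(reliability, k=2):
    """Link every node to the k most reliable other nodes.

    reliability : dict node -> reliability score in [0, 1]
    Returns dict node -> set of neighbours. Links are made
    bidirectional so the overlay stays connected for gossip;
    highly reliable nodes naturally accumulate extra links and
    become the stable backbone of the unstructured overlay.
    """
    overlay = {n: set() for n in reliability}
    for node in reliability:
        ranked = sorted((m for m in reliability if m != node),
                        key=lambda m: reliability[m], reverse=True)
        for m in ranked[:k]:
            overlay[node].add(m)
            overlay[m].add(node)  # symmetric link
    return overlay
```

Because every node points at the most reliable peers, the loss of an unreliable leaf node perturbs few links, which is one intuition behind the fault-tolerance claim; a full design would also rewire links when reliability estimates change.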
Logging information is precious because it records the execution of a system; it is produced by millions of events, from simple application logins to random system errors. Most security-related problems in the cloud ecosystem, such as intruder attacks, data loss and denial of service, could be avoided if the Cloud Service Provider (CSP) or the Cloud User (CU) analysed the logging information. In this paper we introduce several challenges, namely the place of monitoring, the security, and the ownership of the logging information between CSP and CU.
We also propose a logging architecture for analysing the behaviour of the cloud ecosystem in order to avoid data breaches and other security-related issues in the CSP space. We believe that the proposed architecture can provide maximum trust between CU and CSP.