Refine
Year of publication
Document Type
- Conference Proceeding (344)
- Article (reviewed) (135)
- Article (unreviewed) (94)
- Book (33)
- Part of a Book (31)
- Patent (30)
- Letter to Editor (13)
- Contribution to a Periodical (8)
- Report (2)
- Doctoral Thesis (1)
Conference Type
- Conference Paper (235)
- Conference Abstract (75)
- Other (22)
- Conference Poster (9)
- Conference Proceedings Volume (3)
Language
- English (460)
- German (227)
- Other language (2)
- Multiple languages (1)
- Russian (1)
- Spanish (1)
Has Fulltext
- no (692)
Is part of the Bibliography
- yes (692)
Keywords
- RoboCup (20)
- Communication (15)
- Mathematics (12)
- Embedded System (8)
- Smart Grid (8)
- Fuel Cell (7)
- CST (7)
- Energy Supply (7)
- RF Ablation (7)
- Heart Disease (7)
Institute
- Fakultät Elektrotechnik und Informationstechnik (E+I) (until 03/2019) (692)
Open Access
- Closed Access (244)
- Open Access (221)
- Closed (73)
- Bronze (56)
- Diamond (1)
- Green (1)
This paper describes the Sweaty II adult-size humanoid robot in its attempt to qualify for the RoboCup 2018 adult-size humanoid competition. Sweaty finished second in the RoboCup 2017 adult-size league. Its main characteristics are described in the Team Description Paper 2017; this paper describes the improvements that have been made, or are planned, for RoboCup 2018.
The soccer simulation league is one of the founding leagues of RoboCup. In this paper we discuss its past, present, and planned future achievements and changes. We also summarize the connections and inter-league achievements of this league and provide an overview of the community contributions that made it successful.
The excessive control signaling required for dynamic scheduling in Long Term Evolution (LTE) networks impedes the deployment of ultra-reliable low-latency applications. Semi-persistent scheduling was originally designed for constant-bit-rate voice applications; however, its very low control overhead makes it a potential latency reduction technique in LTE. In this paper, we investigate resource scheduling in narrowband fourth-generation LTE networks through Network Simulator 3 (NS3) simulations. The current release of NS3 does not include a semi-persistent scheduler for the LTE module, so we implemented the semi-persistent scheduling feature in NS3 to evaluate and compare performance in terms of uplink latency. We evaluate dynamic scheduling and semi-persistent scheduling in order to analyze the impact of the resource scheduling method on uplink latency.
Vehicle-to-Everything (V2X) communication promises improvements in road safety and efficiency by enabling low-latency and reliable communication services for vehicles. Besides Mobile Broadband (MBB), there is a need to develop Ultra-Reliable Low-Latency Communication (URLLC) applications with cellular networks, especially where safety-related driving applications are concerned. Future cellular networks are expected to support novel latency-sensitive use cases. Many V2X applications, such as collaborative autonomous driving, require very low latency and high reliability in order to support real-time communication between vehicles and other network elements. In this paper, we classify V2X use cases and their requirements in order to identify cellular network technologies able to support them. The bottleneck of medium access in 4G Long Term Evolution (LTE) networks is the random access procedure, which we evaluate through simulations to further detail future limitations and requirements. Limitations and improvement possibilities for the next generation of cellular networks are then detailed. Moreover, the results presented in this paper provide the limits of different parameter sets with regard to the requirements of V2X-based applications, giving a starting point for migrating to Narrowband IoT (NB-IoT) or 5G solutions.
Next-generation cellular networks are expected to improve reliability, energy efficiency, data rate, capacity, and latency. Originally, Machine Type Communication (MTC) was designed for low-bandwidth, high-latency applications such as environmental sensing or smart waste bins, but there is additional demand for applications with low latency requirements, such as industrial automation and driverless cars. Improvements to 4G Long Term Evolution (LTE) networks are required on the way to next-generation cellular networks providing very low latency and high reliability. To this end, we present an in-depth analysis of the parameters that contribute to latency in 4G networks, along with a description of latency reduction techniques. We implement and validate these latency reduction techniques in the open-source network simulator NS3 for the narrowband user equipment category Cat-M1 (LTE-M) to analyze the improvements. The results presented are a step towards enabling narrowband Ultra-Reliable Low-Latency Communication (URLLC) networks.
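As a rough illustration of why semi-persistent scheduling reduces uplink latency, the following toy model (not from the paper; all timing values are illustrative placeholders, not 3GPP-exact) compares the two scheduling modes: dynamic scheduling pays for a scheduling-request/grant handshake, while semi-persistent scheduling (SPS) only waits for its pre-configured resource.

```python
def dynamic_latency_ms(sr_period_ms=10.0, grant_delay_ms=4.0, tti_ms=1.0):
    # average wait for a scheduling-request opportunity, then the grant
    # round-trip, then one transmission time interval
    return sr_period_ms / 2 + grant_delay_ms + tti_ms

def sps_latency_ms(sps_period_ms=10.0, tti_ms=1.0):
    # no SR/grant handshake: average wait for the next pre-configured
    # slot, then transmit
    return sps_period_ms / 2 + tti_ms
```

With equal resource periodicity, the model makes the saving explicit: SPS removes the entire grant handshake from the latency budget.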
Integration of BACNET OPC UA-Devices Using a JAVA OPC UA SDK Server with BACNET Open Source Library
(2014)
Although short-range wireless communication explicitly targets local and very regional applications, range continues to be an extremely important issue. The range directly depends on the so-called link budget, which can be increased by the choice of modulation and coding schemes. In particular, the recent transceiver generation comes with extensive and flexible support for Software Defined Radio (SDR). The SX127x family from Semtech Corp. is a member of this device class and promises significant benefits in range, robust performance, and battery lifetime compared to competing technologies. This contribution gives a short overview of the technologies supporting Long Range (LoRa™), describes the outdoor setup at the Laboratory for Embedded Systems and Communication Electronics of Offenburg University of Applied Sciences, shows detailed measurement results, and discusses the strengths and weaknesses of this technology.
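The link-budget arithmetic behind such range considerations can be sketched as follows. The 14 dBm / 2 dBi / −137 dBm figures below are illustrative values in the range of a high-spreading-factor LoRa configuration, not measurements from this setup:

```python
import math

def link_budget_db(p_tx_dbm, g_tx_dbi, g_rx_dbi, sensitivity_dbm, losses_db=0.0):
    # maximum tolerable path loss: everything the link can "spend" between
    # transmitter output and receiver sensitivity
    return p_tx_dbm + g_tx_dbi + g_rx_dbi - losses_db - sensitivity_dbm

def free_space_path_loss_db(distance_m, freq_hz):
    # free-space path loss: FSPL = 20 * log10(4 * pi * d * f / c)
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# illustrative example: 14 dBm TX, 2 dBi antennas, -137 dBm sensitivity
budget = link_budget_db(14, 2, 2, -137)   # 155 dB
```

Comparing the budget against the path loss at a given distance gives a first-order range estimate; lower-rate modulation and coding improve sensitivity and thus the budget.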
In this paper we integrate the ideas of network coding and relays into an existing practical network architecture used in a wireless network scenario. Specifically, we use the COPE architecture to test our ideas. Since previous works have focused on the communication aspect at the physical layer level, we attempt to take it one step further by including the MAC layer. Our idea is based on information theoretic concepts developed by Shannon in order to reliably apply network coding to increase the net throughput.
The suffix-free-prefix-free hash function construction and its indifferentiability security analysis
(2012)
In this paper, we observe that in the seminal work on indifferentiability analysis of iterated hash functions by Coron et al. and in subsequent works, the initial value (IV) of hash functions is fixed. In addition, these indifferentiability results do not depend on the Merkle–Damgård (MD) strengthening in the padding functionality of the hash functions. We propose a generic n-bit-iterated hash function framework based on an n-bit compression function called suffix-free-prefix-free (SFPF) that works for arbitrary IVs and does not possess MD strengthening. We formally prove that SFPF is indifferentiable from a random oracle (RO) when the compression function is viewed as a fixed input-length random oracle (FIL-RO). We show that some hash function constructions proposed in the literature fit in the SFPF framework while others that do not fit in this framework are not indifferentiable from a RO. We also show that the SFPF hash function framework with the provision of MD strengthening generalizes any n-bit-iterated hash function based on an n-bit compression function and with an n-bit chaining value that is proven indifferentiable from a RO.
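A minimal sketch of an n-bit iterated hash with MD strengthening, the kind of construction the SFPF framework generalizes. SHA-256 merely stands in for the fixed input-length compression function (FIL-RO), and the padding rule is simplified for illustration, so this is not the SFPF construction itself:

```python
import hashlib

def compress(chaining: bytes, block: bytes) -> bytes:
    # fixed-input-length compression function; SHA-256 is only a stand-in
    # for the n-bit FIL-RO of the formal model
    return hashlib.sha256(chaining + block).digest()

def iterated_hash(message: bytes, iv: bytes = b"\x00" * 32,
                  block_size: int = 32) -> bytes:
    # MD strengthening (simplified): append the message length, then
    # zero-pad to a block boundary and iterate from the IV
    padded = message + len(message).to_bytes(8, "big")
    if len(padded) % block_size:
        padded += b"\x00" * (block_size - len(padded) % block_size)
    h = iv
    for i in range(0, len(padded), block_size):
        h = compress(h, padded[i:i + block_size])
    return h
```

Note that the chaining value, IV, and output all have the same width n, matching the "n-bit compression function with n-bit chaining value" setting of the paper; SFPF additionally works for arbitrary IVs without relying on the strengthening step.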
Careful preparation is required if simulations are to be carried out quickly, cost-effectively, and with meaningful results. This article deals with the automated execution of a material-flow simulation study and explains the contributions the users have to make during the process. It reports in detail on the automated procedure and the use of standard simulation models (SSM) as well as special simulation modules (SimDBC). Decisive for a broader adoption of simulation tools are simple handling and an easily understandable definition of the scenarios to be investigated.
Since their dawning, space communications have been among the strongest driving applications for the development of error correcting codes. Indeed, space-to-Earth telemetry (TM) links have extensively exploited advanced coding schemes, from convolutional codes to Reed-Solomon codes (also in concatenated form) and, more recently, from turbo codes to low-density parity-check (LDPC) codes. The efficiency of these schemes has been extensively proved in several papers and reports. The situation is a bit different for Earth-to-space telecommand (TC) links. Space TCs must reliably convey control information as well as software patches from Earth control centers to scientific payload instruments and engineering equipment onboard (O/B) spacecraft. The success of a mission may be compromised because of an error corrupting a TC message: a detected error causing no execution or, even worse, an undetected error causing a wrong execution. This imposes strict constraints on the maximum acceptable detected and undetected error rates.
NEXCODE is a project promoted by the European Space Agency aimed at the research, design, development, and demonstration of a receiver chain for telecommand links in space missions, including new short low-density parity-check codes for error correction. These codes have excellent performance from the error-rate viewpoint, but also pose new challenges regarding synchronization and implementation. In this paper, after a short review of the results obtained through numerical simulations, we present an overview of the breadboard designed for practical testing and the test plan proposed for the verification of the breadboard and the validation of the new codes and novel synchronization techniques under relevant operating conditions.
The separation of nitrogen and methane from hydrogen-rich mixtures is systematically investigated on a recently developed binder-free zeolite 5A. For this adsorbent, the present work provides a series of experimental data on adsorption isotherms and breakthrough curves of nitrogen and methane, as well as their mixtures in hydrogen. Isotherms were measured at temperatures of 283–313 K and pressures of up to 1.0 MPa. Breakthrough curves of CH4, N2, and CH4/N2 in H2 were obtained at temperatures of 300–305 K and pressures ranging from 0.1 to 6.05 MPa with different feed concentrations. An LDF-based model was developed to predict breakthrough curves using measured and calculated data as inputs. The number of parameters and the use of correlations were restricted to focus on the importance of measured values. For the given assumptions, the results show that the model predictions agree satisfactorily with the experiments under the different operating conditions applied.
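The kinetic core of an LDF-based model can be sketched as below. This is only the uptake equation dq/dt = k(q* − q) integrated with an explicit Euler step, not the full column model with its mass balance; the parameter values are arbitrary illustrations, not fitted to the measurements:

```python
def ldf_uptake(q_eq, k, dt, n_steps, q0=0.0):
    # linear driving force (LDF): dq/dt = k * (q_eq - q),
    # integrated with an explicit Euler scheme
    q, history = q0, []
    for _ in range(n_steps):
        q += k * (q_eq - q) * dt
        history.append(q)
    return history
```

The loading approaches the equilibrium value q* exponentially; in a breakthrough simulation this kinetic term is coupled to the gas-phase balance along the column.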
In this TDP we describe a new tool created for testing the strategy layer of our soccer playing agents. It is a complete 2D simulator that simulates the games based on the decisions of 22 agents. With this tool, debugging the decision and strategy layer of our agents is much more efficient than before due to various interaction methods and complete control over the simulation.
In the future, the tool could also serve as a measure to run simulations of game series much faster than with the 3D simulator. This way, the impact of different play strategies could be evaluated much faster than before.
In recent years, the technical gas sulfur hexafluoride (SF6) has repeatedly been the subject of debates about climate protection and about its technical necessity for the operation of switchgear, and it will likely remain so for some time. The gas, composed of one sulfur atom and six fluorine atoms, has been used in medium- and high-voltage switchgear since the late 1960s. As favorable as its properties are in technical use, it is equally harmful to the climate when released into the atmosphere: SF6 is the greenhouse gas with the largest known global warming potential, with a CO2 equivalent (GWP) of 23,900 and an atmospheric lifetime of about 3,200 years. Apart from a few remaining applications in industry, the military, and medicine, it is used today mainly in electrical power supply as an insulating and arc-quenching gas in the switchgear of transmission and distribution networks. Reason enough to discuss its technical necessity, possible alternatives, and the consequences of impending bans. This article first presents the fundamentals of modern SF6 high-voltage switchgear, evaluates the climate impact of escaping SF6, gives an overview of the state of research, and discusses possible consequences of a ban on sulfur hexafluoride in power supply.
Background: Increasing awareness of the importance of evidence-based medicine is demonstrated not only by an increasing number of articles addressing it but also by a specialty-wide evidence-based medicine initiative. The authors critically analyzed the quality of reporting of randomized controlled trials published in this Journal over a 21-year period (1990 to 2010).
Methods: A hand search was conducted, including all issues of Plastic and Reconstructive Surgery from January of 1990 to December of 2010. All randomized controlled trials published during this time period were identified with the Cochrane decision tree for identification of randomized controlled trials. To assess the quality of reporting, a modification of the checklist of the Consolidated Standard of Reporting Trials Statement was used.
Results: Of 7121 original articles published from 1990 to 2010 in the Journal, 159 (2.23 percent) met the Cochrane criteria. A significant increase in the absolute number of randomized controlled trials was seen over the study period (p < 0.0001). The median quality of these trials from 1990 to 2010 was "fair," with a trend toward improved quality of reporting over time (p = 0.127).
Conclusions: A favorable trend is seen with respect to an increased number of published randomized controlled trials in Plastic and Reconstructive Surgery. Adherence to standard reporting guidelines is recommended, however, to further improve the quality of reporting. Consideration may be given to providing information regarding the quality of reporting in addition to the "level of evidence pyramid," thus facilitating critical appraisal.
Structured Innovation with TRIZ in Science and Industry - Creating Value for Customers and Society
(2016)
The design of control systems for concentrator photovoltaic power plants will become more challenging in the future. The reasons are cost pressure, the increasing size of power plants, and new applications for operation, monitoring, and maintenance required by grid operators, manufacturers, and plant operators. Concepts and products for fixed-mounted photovoltaics can only partly be adapted, since control systems for concentrator photovoltaics are considerably more complex due to the required highly accurate sun tracking. To ensure reliable operation over a lifetime of more than 20 years, robustness of the control system is a crucial design criterion. This work considers common engineering techniques for robustness, safety, and security. Potential failures of the control system are identified and their effects analyzed, and different attack scenarios are investigated. The outcomes are design criteria that counter both failures of system components and malicious attacks on the control system of future concentrator photovoltaic power plants. Such design criteria are a transparent state management through all system layers, self-tests, and update capabilities for security concerns. The findings enable future research to develop a more robust and secure control system for concentrator photovoltaics when implementing new functionalities in the next generation.
The communication system of a large-scale concentrator photovoltaic power plant is very challenging. Manufacturers are building power plants having thousands of sun tracking systems equipped with communication and distributed over a wide area. Research is necessary to build a scalable communication system enabling modern control strategies. This poster abstract describes the ongoing work on the development of a simulation model of such power plants in OMNeT++. The model uses the INET Framework to build a communication network based on Ethernet. First results and problems of timing and data transmission experiments are outlined. The model enables research on new communication and control approaches to improve functionality and efficiency of power plants based on concentrator photovoltaic technology.
The direct marketing of electricity from wind and solar is an important step of the energy transition. On the one hand, market integration can achieve independence from EEG subsidies. On the other hand, these mechanisms orient electricity generation toward demand, which contributes to the stability of the power grid. One example is the local marketing of PV electricity in an apartment building. To implement it, the actors need a measurement and control system that records meter and plant data on site and simplifies the billing of the tenants. It should also calculate figures such as the PV share and, where applicable, control a combined heat and power (CHP) unit. Neither the metering systems of the metering point operators nor the control systems of PV plants or CHP units adequately meet these requirements. Research, meanwhile, is already one step ahead and is working on technical systems designed for considerably more complex energy-system and market topologies. In this work, the new technical requirements of direct marketing in an apartment building are identified and compared with current market products as well as with the research system "OpenMUC".
A new RFID/NFC (ISO 15693 standard) based, inductively powered passive SoC (system on chip) for biomedical applications is presented. The proposed SoC consists of an integrated 32-bit microcontroller, an RFID/NFC frontend, a sensor interface circuit, an analog-to-digital converter, and peripherals such as a timer, an SPI interface, and memory devices. An energy harvesting unit supplies the power required by the entire system for fully passive operation. The complete chip is realized in 0.18 μm CMOS technology with a chip area of 1.5 mm × 3.0 mm.
In this paper, a complete passive transponder device is discussed that is meant to monitor leakage in silicone breast implants. The passive tag operates in the HF frequency range at 13.56 MHz using the RFID ISO 15693 standard. The complete system consists of the transponder, a reader, and a PC. The paper focuses on the development of a state-of-the-art passive RFID transponder that periodically monitors the condition of silicone breast implants in order to detect leakage. Keywords: RFID (radio frequency identification device), EM (electromagnetic) field, passive transponder, silicone breast implants.
In the past two decades much has been published on whiplash injury, yet both the confusion regarding the condition, and the medicolegal discussion about it have increased. In this paper, functional imaging research results are summarized using MRIcroGL3D visualization software and assembled in an image comprising regions of cerebral activation and deactivation.
Electrolyte-Gated Field-Effect Transistors Based on Oxide Semiconductors: Fabrication and Modeling
(2017)
Printed electronics offers certain technological advantages over its silicon-based counterparts, such as mechanical flexibility, low process temperatures, and a maskless, additive manufacturing process, leading to extremely low-cost manufacturing. However, to be exploited in applications such as smart sensors, the Internet of Things, and wearables, it is essential that printed devices operate at low supply voltages. Electrolyte-gated field-effect transistors (EGFETs) using solution-processed inorganic materials, fully printed with inkjet printers at low temperatures, are very promising candidates to provide such solutions. In this paper, we discuss the technology, process, modeling, fabrication, and design aspects of circuits based on EGFETs. We show how measurements performed in the lab can be accurately modeled in order to be integrated into the design automation tool flow in the form of a Process Design Kit (PDK). We also review some of the remaining challenges in this technology and discuss our future directions to address them.
E-Tutoren-Ausbildung: Lernerfahrungen reflektieren – Lehrhandlungskompetenzen dialogisch aufbauen
(2014)
Design of next-generation CDMA using orthogonal complementary codes and offset stacked spreading
(2007)
This article presents an innovative code-division multiple access system architecture that is based on orthogonal complementary spreading codes and time-frequency domain spreading. The architecture has several advantages compared to conventional CDMA systems. Specifically, it offers multiple-access-interference-free operation in AWGN channels, reduces co-channel interference significantly, and has the potential for higher capacity and spectral efficiency than conventional CDMA systems. This is accomplished by using an "offset stacked" spreading modulation technique followed by quadrature amplitude modulation, which optimizes performance in a fading environment. This new spreading modulation scheme also simplifies the rate matching algorithms relevant for multimedia services and IP-based applications.
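The defining property of complementary codes, that the aperiodic autocorrelations of a pair sum to zero at every nonzero shift, can be checked numerically. The length-4 Golay pair below is a textbook example, not a code from the proposed system:

```python
def acf(seq, shift):
    # aperiodic autocorrelation of a bipolar sequence at a given shift
    return sum(seq[i] * seq[i + shift] for i in range(len(seq) - shift))

# a standard Golay complementary pair of length 4
a = [1, 1, 1, -1]
b = [1, 1, -1, 1]

sums = [acf(a, s) + acf(b, s) for s in range(4)]
# the pairwise ACF sums vanish at every nonzero shift: [8, 0, 0, 0]
```

This ideal correlation behavior is what allows a CDMA system built on complementary codes to avoid multiple-access interference in AWGN channels.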
Flashcards are a well-known and proven method to learn and memorise. This way of learning is perfectly suited for “learning on the way,” but carrying all the flashcards can be awkward. In this scenario, a mobile device (mobile phone) is an adequate solution. The new mobile device operating system Android from Google allows multimedia-enriched applications to be written.
The developed solution enables the presentation of animations and 3D virtual reality (VR) on mobile devices and is well suited for mobile learning, thus creating new possibilities in the area of e-learning worldwide. Difficult relations in physics as well as intricate experiments in optics can be visualised on mobile devices without need for a personal computer.
“Today’s network landscape consists of quite different network technologies, wide range of end-devices with large scale of capabilities and power, and immense quantity of information and data represented in different formats” [9]. Much effort is being invested to establish open, scalable, and seamless integration of various technologies and content presentation for different devices, including mobile ones, taking the individual situation of the end user into account. This is difficult because many kinds of devices are used by different users, at different times, or even in parallel by the same user; this is not predictable and has to be recognized by the system in order to know the device capabilities. Not only the devices but also content and user interfaces are big issues, because they can include different kinds of data formats such as text, image, audio, video, 3D virtual reality data, and other upcoming formats. The Language Learning Game (LLG) is an example of a device-independent application where different kinds of devices and data formats, as the content of a flashcard, are used for collaborative learning. The idea of this game is to create a short story in a foreign language using mobile devices. The story is developed by a group of participants exchanging sentences via a flashcard system. In this way the participants can learn from each other through knowledge sharing, without fear of making mistakes, because the group members are anonymous. Moreover, they do not need constant support from a teacher.
The invention concerns a method for spectrum monitoring of a given frequency band, in which the spectral power density (S(f)) within the given frequency band is determined for all noise and signal components in the band and, in order to detect the presence of one or more signals within the band, it is evaluated whether the spectral power density (S(f)) exceeds a threshold value (λ). According to the invention, the threshold value (λ) is calculated in accordance with an estimate of a distribution density (hR(S)) for the noise component of the spectral power density (S(f)) within the given frequency band and with a predefined value for the false-alarm probability (Pfa).
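A minimal sketch of such a threshold computation, under the additional assumption (not stated in the patent, which estimates the noise distribution density itself) that the noise-only spectral power bins are exponentially distributed, as for complex Gaussian noise:

```python
import math

def detection_threshold(noise_mean_power, p_fa):
    # for exponentially distributed spectral power bins,
    # P(S > lam) = exp(-lam / mean)  =>  lam = -mean * ln(P_fa)
    return -noise_mean_power * math.log(p_fa)

def noise_mean_estimate(psd_bins):
    # crude robust estimate of the noise mean: the median of exponential
    # samples equals mean * ln(2), and the median is insensitive to a few
    # strong signal bins
    s = sorted(psd_bins)
    return s[len(s) // 2] / math.log(2)
```

A bin whose power exceeds the threshold is declared a signal; lowering the prescribed false-alarm probability raises the threshold, trading sensitivity for fewer false detections.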
The invention concerns a method for automatically classifying the modulation format of a digitally modulated signal. The received I/Q data points are first evaluated for each modulation format by means of a clustering procedure, after which, for each of the modulation formats, every I/Q data point is assigned to a determined cluster centroid. Then, for each modulation format, the value of a utility function is computed, which takes a higher (lower) value the better the I/Q data points assigned to a cluster centroid are covered by that centroid and the smaller the Euclidean distances of the determined cluster centroids from the associated constellation points are. The modulation format for which the utility function takes the highest (lowest) value is then assumed to be the format of the digitally modulated signal.
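A drastically simplified sketch of constellation-based format classification: it skips the clustering step of the invention and scores raw I/Q samples directly against ideal constellation points with a plain minimum-distance cost, which merely stands in for the patent's utility function:

```python
import random

CONSTELLATIONS = {
    "BPSK": [(-1.0, 0.0), (1.0, 0.0)],
    "QPSK": [(-1.0, -1.0), (-1.0, 1.0), (1.0, -1.0), (1.0, 1.0)],
}

def fit_cost(samples, points):
    # mean squared Euclidean distance of each I/Q sample to its nearest
    # reference constellation point (lower = better fit)
    return sum(min((i - p) ** 2 + (q - r) ** 2 for p, r in points)
               for i, q in samples) / len(samples)

def classify(samples):
    # pick the format whose ideal constellation fits the samples best
    return min(CONSTELLATIONS, key=lambda fmt: fit_cost(samples, CONSTELLATIONS[fmt]))

random.seed(0)
noisy_bpsk = [(s + random.gauss(0, 0.1), random.gauss(0, 0.1))
              for s in [1.0, -1.0] * 50]
```

For noisy BPSK samples the BPSK cost stays near the noise variance while the QPSK cost is dominated by the missing quadrature component, so the classifier separates the two formats reliably at this noise level.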
At Offenburg University, the transition from school to university is supported in the preparatory mathematics course via smartphone or tablet. A math app provides hints, intermediate steps, and detailed explanations for the training exercises on demand, helping students develop the solutions at their individual learning pace. The mobile approach makes it possible to familiarize the roughly 400 participants of the classroom course with e-learning in ordinary classrooms without PC equipment, and supports flexible practice times and places beyond the classroom sessions. By aligning the content with the inter-university COSH (Cooperation Schule Hochschule) catalogue of minimum requirements in mathematics, a solution emerged that any first-year student can use to prepare for university, that matches the bridge-course content of many universities, and for which cooperation projects with schools are already starting.
We present a novel scheme for Slotted ALOHA random access systems that combines physical-layer network coding (PLNC) with multiuser detection (MUD). The PLNC and MUD are applied jointly at the physical layer to be able to extract any linear combination of messages experiencing a collision within a slot. The set of combinations extracted from a whole frame is then processed by the receiver to recover the original packets. A simple precoding stage at the transmitting terminals allows the receiver to further decrease the packet loss rate. We present results for the decoding at the physical layer as well as several performance measures at frame level, namely, throughput, packet loss rate, and energy efficiency. The results we present are promising and suggest that a cross-layer approach leveraging on the joint use of PLNC and MUD can significantly improve the performance of random access systems in the presence of slow fading.
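The frame-level recovery step can be illustrated with the simplest possible case, a binary XOR combination of two collided packets; the actual scheme uses MUD to extract more general linear combinations at the physical layer:

```python
def xor_bytes(x: bytes, y: bytes) -> bytes:
    # bytewise XOR, the binary special case of a linear packet combination
    return bytes(a ^ b for a, b in zip(x, y))

# slot 1: packets A and B collide; PLNC decodes their XOR instead of failing
# slot 2: only A is sent (a singleton slot), so A is decoded directly
packet_a = b"hello"
packet_b = b"world"
slot1 = xor_bytes(packet_a, packet_b)   # linear combination from the collision
slot2 = packet_a                        # singleton slot

recovered_b = xor_bytes(slot1, slot2)   # frame-level processing recovers B
```

Collecting enough independent combinations across a frame lets the receiver solve for all original packets, which is what drives the throughput and packet-loss gains reported in the paper.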
Compliance with the architecture defined during the design phase of a software project must be ensured during the development phase. This paper describes an extension of the Eclipse plugin JDepend4Eclipse that allows rule sets to be managed and checks a project for existing, forbidden dependencies at the push of a button within the development environment. The extension of the plugin is already used successfully in internal projects of Offenburg University and will soon be publicly available.
This paper presents an overview of the coding aspects of a GNSS receiver. Coding allows detection and correction of channel-induced errors at the receiver; here the focus is on the mitigation of threats from malicious interference. Although the effects of interference at different stages of GNSS baseband processing have been deeply analyzed in the literature, little attention has been devoted to its impact on the navigation message decoding stage. This paper provides an introduction to the various coding schemes employed by current GNSS signals, discussing their performance in the presence of noise in terms of block error rate. Additionally, the benefits of soft-decoding schemes for navigation message decoding are highlighted when jamming interference is present. The proposed scheme requires estimating the noise-plus-interference power, yielding enhanced decoding performance under severe jamming conditions. Finally, cryptographic schemes as a means of providing anti-spoofing for geosecurity location-based services, and their potential vulnerabilities, are discussed, with particular emphasis on the dependence of such schemes on successful navigation message decoding.
In the brain-cell microenvironment, diffusion plays an important role: apart from delivering glucose and oxygen from the vascular system to brain cells, it also moves informational substances between cells. The brain is an extremely complex structure of interwoven, intercommunicating cells, but recent theoretical and experimental works showed that the classical laws of diffusion, cast in the framework of porous media theory, can deliver an accurate quantitative description of the way molecules are transported through this tissue. The mathematical modeling and the numerical simulations are successfully applied in the investigation of diffusion processes in tissues, replacing the costly laboratory investigations. Nevertheless, modeling must rely on highly accurate information regarding the main parameters (tortuosity, volume fraction) which characterize the tissue, obtained by structural and functional imaging. The usual techniques to measure the diffusion mechanism in brain tissue are the radiotracer method, the real time iontophoretic method and integrative optical imaging using fluorescence microscopy. A promising technique for obtaining the values for characteristic parameters of the transport equation is the direct optical investigation using optical fibers. The analysis of these parameters also reveals how the local geometry of the brain changes with time or under pathological conditions. This paper presents a set of computations concerning the mass transport inside the brain tissue, for different types of cells. By measuring the time evolution of the concentration profile of an injected substance and using suitable fitting procedures, the main parameters characterizing the tissue can be determined. This type of analysis could be an important tool in understanding the functional mechanisms of effective drug delivery in complex structures such as the brain tissue. 
It also opens up possibilities for optical imaging methods for in vitro and in vivo measurements using optical fibers. The model may also help in radiotracer biomarker models for understanding the mechanism of action of new chemical entities.
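The fitting procedure mentioned above can be sketched under the standard porous-medium point-source model, where the free diffusion coefficient is reduced by the tortuosity and the source strength is diluted by the extracellular volume fraction. All numbers below are illustrative, not data from the paper:

```python
import math

def concentration(t, r, D, alpha, lam, Q=1.0):
    """Point-source solution of the diffusion equation in a porous
    medium: the free diffusion coefficient D is reduced by the
    tortuosity lam to D* = D / lam**2, and the source strength Q is
    diluted by the extracellular volume fraction alpha."""
    D_eff = D / lam**2
    return (Q / alpha) * (4 * math.pi * D_eff * t) ** -1.5 * \
        math.exp(-r**2 / (4 * D_eff * t))

# Synthetic "measured" profile at distance r from the injection point,
# generated with assumed ground-truth parameters alpha = 0.2, lam = 1.6.
D, r = 7.6e-6, 1.0e-2                      # illustrative units (cm^2/s, cm)
times = [t * 1.0 for t in range(5, 65, 5)]  # seconds
measured = [concentration(t, r, D, 0.2, 1.6) for t in times]

# Recover (alpha, lam) by a brute-force least-squares grid search.
best = None
for a in [i / 100 for i in range(10, 41)]:
    for l in [i / 100 for i in range(120, 201)]:
        err = sum((concentration(t, r, D, a, l) - m) ** 2
                  for t, m in zip(times, measured))
        if best is None or err < best[0]:
            best = (err, a, l)
print(best[1], best[2])  # -> 0.2 1.6
```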
Combined with suitable sensors, correlators can extract high-precision data from completely random signal streams. Such instruments are needed to measure flow rates, mass flows, and velocities. Thanks to the increasing speed of computers and their falling prices, correlators are no longer used only in research and development, as in the past, but increasingly also as industrial measuring instruments.
The measuring system described in this article consists of a control unit for entering commands, a digital display for showing the measured values, and a remotely located probe that acquires and digitizes the measurements. The two parts are connected by two optical fibers. One fiber carries the supply power and the control signals to the probe: a laser diode in the control unit delivers optical output power that is coupled into the fiber, and a power converter in the probe converts the optical power back into electrical energy. Via the second fiber, the probe sends the acquired data to the control unit. A microcomputer in the probe captures and digitizes the signals and transmits them to the control unit, where a second microcomputer derives the measured quantity and shows it on an LC display.
As part of the Continental Deep Drilling Program of the Federal Republic of Germany (KTB), the pilot borehole was started in September 1987 near Windischeschenbach in the Upper Palatinate and successfully completed at a depth of 4000 m in the spring of last year. The main borehole, which begins this year, is intended to penetrate 10 to 12 km into the Earth's interior. Extensive geophysical data are obtained not only from rock and fluid samples but also by means of measuring probes. This article describes how the magnetic susceptibility of rocks is measured at ambient temperatures of up to about 300 °C and a pressure of 2 kbar, and how the data are transmitted from the probe to the control computer over a 14 km borehole cable.
Optical fiber or two-wire line as required: a ring-bus system for flexible measurement data acquisition
(1986)
This article presents a measurement data acquisition system in which data transmission can take place either over a two-wire line or via optical fiber, depending on the requirements. It is a ring-shaped serial bus system for which a universal microprocessor-based interface card was built. After a description of the structure of the system, the transmission code and the data protocol are discussed, followed by the design of the interface card. The article closes with remarks on the software modules for the interface card, its program flow, and the synchronization of the card.
Cardiac resynchronization therapy (CRT) with hemodynamically optimized biventricular pacing is an established therapy for heart failure patients with sinus rhythm, reduced left ventricular ejection fraction and wide QRS complex. The aim of the study was to evaluate the electrical right and left cardiac atrioventricular delay and the left atrial delay in CRT responders and non-responders with sinus rhythm.
Methods: Heart failure patients with New York Heart Association class 3.0 ± 0.3, sinus rhythm and 27.7 ± 6.1% left ventricular ejection fraction were measured by surface ECG and transesophageal bipolar left atrial and left ventricular ECG before implantation of CRT devices. Electrical right cardiac atrioventricular delay was measured between onset of P wave and onset of QRS complex in the surface ECG, left cardiac atrioventricular delay between onset of left atrial signal and onset of left ventricular signal in the transesophageal ECG and left atrial delay between onset and offset of left atrial signal in the transesophageal ECG.
Results: Electrical atrioventricular and left atrial delays were 196.9 ± 38.7 ms for the right and 194.5 ± 44.9 ms for the left cardiac atrioventricular delay, and 47.7 ± 13.9 ms for the left atrial delay. There was a positive correlation between right and left cardiac atrioventricular delay (r = 0.803, P < 0.001) and a negative correlation between left atrial delay and left ventricular ejection fraction (r = −0.694, P = 0.026), with 67% CRT responders.
Conclusions: Transesophageal electrical left cardiac atrioventricular delay and left atrial delay may be useful preoperative atrial desynchronization parameters to improve CRT optimization.
Not being left behind in a lecture and recording its many results in a structured way is a major challenge for first-year students. Lecture notes are very often incomplete, unstructured, or scattered. For many students, a note-taking marathon and thinking along are mutually exclusive. Activating teaching methods, changes of media, and instructional videos often make a structured record of the content of the lecture even more difficult.
We present a best-practice example of designing mathematics lectures around a tablet-based interactive script. It serves as a pacemaker between input and processing phases and supports structured note-taking by combining the advantages of the blackboard, PPT slides, and a classical script. Traditional methods are combined with technological possibilities in order to address the challenges mentioned above more consciously in one's teaching style. Connections to virtual classrooms and video-supported teaching are outlined.
Many lecturers experience the heterogeneity of first-year students at first hand in introductory courses: heterogeneity not only with respect to prior subject knowledge, but also with respect to available learning strategies, skills, motivation, and self-discipline. Even following a 90-minute lecture attentively and recording its results in a structured way is a very big challenge for many. This experience report examines the potential of modern tablets, tested at Offenburg University since the winter semester 2015/16, to combine the advantages of classical handwritten note-taking with a pre-structure such as PPT slides provide.
Do first-year students manage to record the results of their courses in a structured way? Or are many already overwhelmed by producing a complete set of notes? Do activating methods, changes of media, and the desire for a structured record of the content of the lecture sometimes even work against each other? More and more students take, or want to take, their notes on a tablet. How could teaching respond more strongly to these aspects?
A practical approach is presented for designing mathematics lectures around a tablet-based interactive script. It serves as a pacemaker between input and processing phases and supports structured note-taking by combining the advantages of the blackboard, PPT slides, and a classical script. Traditional methods are combined with technological possibilities in order to address the challenges mentioned above more consciously in one's teaching style. Connections to virtual classrooms and video-supported teaching are outlined.
At Offenburg University, the transition from school to university is supported in the preparatory mathematics course by smartphone and tablet. A math app offers hints and intermediate steps for the training exercises on demand, helping students work at their individual pace and level of prior knowledge. This activates the course participants even in very heterogeneous groups. The mobile approach makes it possible to support the roughly 400 participants of the on-site course with an eCoach in ordinary classrooms without PC equipment, and allows practice time and place to be made flexible beyond the classroom sessions.
This learning approach, attractive to today's young generation, emerged from a cooperation project between MassMatics UG and Offenburg University. Through its alignment with the cross-university minimum requirements catalogue for mathematics (Mindestanforderungskatalog Mathematik) of the cosh working group, a solution was created that any first-year student can use to prepare for university (even without attending the on-site course), that matches the bridging-course content of many universities, and for which cooperation projects with schools are currently starting.
The project MINT-College TIEFE (Talente Individuell, Erfolgreich Fördern und Entwickeln) at Offenburg University is funded within the federal-state program Qualitätspakt Lehre from funds of the Federal Ministry of Education and Research (BMBF) under grant number 01PL11016. The project is supported by the information center of Offenburg University.
Gaps in basic math knowledge are among the biggest obstacles to a successful start in university. Students starting their studies in STEM disciplines display significant diversity, “math anxiety” is a widespread phenomenon, and the transition to a self-determined way of studying presents a huge challenge. Universities offer support measures such as preparatory courses. Over the years, Offenburg University realized that with increased diversity, traditional ways of teaching in front of the class have become inefficient. The majority of the students remained inactive and just listened to the teachers’ explanations and the few active participants’ answers.
Since 2013, our new course concept has fostered a shift from teaching to active learning on a large scale, involving several hundred participants of our on-site preparatory math courses. This switch to broad active practicing, however, must go hand in hand with providing individual support for an increasingly diverse student body. Meanwhile, students bring along their own mobile devices, and the training app TeachMatics serves as a facilitator. The course concept has been very well received by both students and teachers.
Smartphones Welcome! Preparatory Course in Mathematics using the Mobile App MassMatics. Case Study
(2015)
Existing approaches for solving multi-vehicle pickup and delivery problems with soft time windows typically use common benchmark sets to verify their performance. However, there is a gap between these benchmark sets and real-world problems with respect to instance size and problem complexity. In this paper we show that a combination of existing approaches together with improved heuristics is able to deal with the instance sizes and complexity of real-world problems. The cost-savings potential of the heuristics is compared to human dispatching plans generated from the data of a European carrier.
This paper describes the new Sweaty II humanoid adult size robot trying to qualify for the RoboCup 2016 adult size humanoid competition. Based on the experience gained at RoboCup 2014, the Sweaty robot has been completely redesigned into the new robot Sweaty II. A major change is the use of linear actuators for the legs. Another characteristic is its indirect actuation by means of rods, which allows a variable transmission ratio depending on the joint angle.
This paper describes the new Sweaty humanoid adult size robot trying to qualify for the RoboCup 2014 adult size humanoid competition. The robot is built from scratch to eventually allow it to run. One characteristic is that water evaporation is used for cooling to prevent the motors from overheating: the robot literally sweats, which gave it its name. Another characteristic is that the motors are not connected directly to the frame but by means of beams, which allows a variable transmission ratio depending on the joint angle.
This paper describes the Sweaty II humanoid adult size robot trying to qualify for the RoboCup 2017 adult size humanoid competition. Sweaty came 2nd in RoboCup 2016 adult size league. The paper describes the main characteristics of Sweaty that made this success possible, and improvements that have been made or are planned to be implemented for RoboCup 2017.
Private households constitute a considerable share of Europe's electricity consumption. The current electricity distribution system treats them as effectively passive individual units. In the future, however, users of the electricity grid will be involved more actively in the grid operation and can become part of intelligent networked collaborations. They can then contribute the demand and supply flexibility at their disposal and, as a result, help to better integrate renewable energy in-feed into the distribution grids.
Experimental and theoretical investigations of the time of equalization of the concentration of an impurity in a rectangular flow‐type chamber have been carried out. It has been shown that the process of equalization of the concentration with time is exponential in character. The characteristic equalization time has been computed using the theory of turbulent diffusion. Theoretical results describe experimental regularities with an accuracy of about 10%. The value of the coefficient of turbulent diffusion for different configurations of flows in the chamber has been obtained from a comparison of experimental and calculated results.
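The exponential equalization law reported above lends itself to a simple parameter-recovery sketch: since ln(C(t) − C∞) is linear in t with slope −1/τ, a least-squares line through the log-residuals yields the characteristic equalization time. All values below are illustrative, not data from the paper:

```python
import math

def mixing_curve(t, c_inf, c0, tau):
    """Exponential equalization of an impurity concentration:
    C(t) = C_inf + (C0 - C_inf) * exp(-t / tau)."""
    return c_inf + (c0 - c_inf) * math.exp(-t / tau)

# Synthetic readings for an assumed characteristic time tau = 12 s.
tau_true, c_inf, c0 = 12.0, 1.0, 5.0
times = [2.0 * k for k in range(1, 11)]
readings = [mixing_curve(t, c_inf, c0, tau_true) for t in times]

# ln(C(t) - C_inf) is linear in t with slope -1/tau, so a least-squares
# line through the log-residuals recovers the equalization time.
ys = [math.log(c - c_inf) for c in readings]
n = len(times)
t_bar = sum(times) / n
y_bar = sum(ys) / n
slope = sum((t - t_bar) * (y - y_bar) for t, y in zip(times, ys)) / \
        sum((t - t_bar) ** 2 for t in times)
tau_est = -1.0 / slope
print(round(tau_est, 3))  # -> 12.0
```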
The Humboldt digital library (HDL) represents an innovative system to access the works and legacy of Alexander von Humboldt in a digital form on the Internet (www.avhumboldt.net). It contributes to the key question of how to present interconnected data in an appropriate form using information technologies. The HDL has been created as a dynamic digital library with the capability of connecting multilingual and multimedia data from diverse online archives. Humboldt's volumes have become available, and beyond that, any relevant information related to Humboldt's observations, even outside his works, can become immediately accessible. This makes it possible to recognize natural changes and compare Humboldt's descriptions with recent situations. The technology we have developed addresses the issues of sustainability and makes it possible to detect changes in the environment since the time of Humboldt's observations.
In their famous work on prospect theory Kahneman and Tversky have presented a couple of examples where human decision making deviates from rational decision making as defined by decision theory. This paper describes the use of extended behavior networks to model human decision making in the sense of prospect theory. We show that the experimental findings of non-rational decision making described by Kahneman and Tversky can be reproduced using a slight variation of extended behavior networks.
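The deviations reproduced by the behavior networks stem from Kahneman and Tversky's value function, which is concave for gains and convex but steeper for losses. A minimal sketch (probability weighting is omitted for brevity; the parameter values are the commonly cited estimates α = β = 0.88, λ = 2.25) reproduces the reflection effect:

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Kahneman/Tversky value function: concave for gains, convex and
    steeper (loss aversion factor lam) for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

# A sure gain of 500 is preferred to a 50% chance of 1000, while the
# mirrored loss gamble is preferred to the sure loss of 500: the
# classic reflection effect (risk aversion for gains, risk seeking
# for losses).
sure_gain = prospect_value(500)
gamble_gain = 0.5 * prospect_value(1000)
sure_loss = prospect_value(-500)
gamble_loss = 0.5 * prospect_value(-1000)
print(sure_gain > gamble_gain, gamble_loss > sure_loss)  # -> True True
```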
In this paper we show that a model-free approach to learning behaviors in joint space can be successfully used to utilize the toes of a humanoid robot. Keeping the approach model-free makes it applicable to any kind of humanoid robot, or robot in general. Here we focus on the benefit for robots with toes, which is otherwise difficult to exploit. The task was to learn different kick behaviors on simulated Nao robots with toes in the RoboCup 3D soccer simulator. As a result, the robot learned to step on its toe for a kick that performs 30% better than the same kick learned without toes.
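The learning setup itself (simulated Nao robots in the RoboCup 3D server) cannot be reproduced here, but the model-free idea admits a generic sketch: optimize joint-space parameters directly from a measured outcome, without any model of the robot. The quadratic kick_quality function below is a toy stand-in for the measured kick distance; all names and numbers are illustrative:

```python
import random

def kick_quality(params):
    """Toy stand-in for the measured kick outcome: in the real setup
    this would replay the behavior in the simulator and measure the
    ball displacement. Here, a smooth function whose optimum puts
    every joint-angle parameter at 1.0."""
    return -sum((p - 1.0) ** 2 for p in params)

def model_free_search(dim=6, iterations=2000, sigma=0.1, seed=0):
    """Simple (1+1) hill climbing in joint space: perturb the current
    keyframe parameters with Gaussian noise and keep the perturbation
    whenever the measured outcome improves. No robot model needed."""
    rng = random.Random(seed)
    best = [0.0] * dim
    best_q = kick_quality(best)
    for _ in range(iterations):
        cand = [p + rng.gauss(0.0, sigma) for p in best]
        q = kick_quality(cand)
        if q > best_q:
            best, best_q = cand, q
    return best, best_q

params, quality = model_free_search()
```

In the real setup each evaluation means replaying the motion in the simulator, so sample efficiency matters far more than in this toy; the abstract's approach shares only the model-free, outcome-driven structure sketched here.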
This paper describes the magmaOffenburg 3D simulation team trying to qualify for RoboCup 2009. It focuses on two distinctive features of the team: decision making using extended behavior networks, and its software architecture and implementation in Java, which opens the simulation to the Java community.
This paper describes the magmaOffenburg 3D simulation team trying to qualify for RoboCup 2010. While last year's TDP focused on decision making using extended behavior networks and on the team's software architecture and implementation, this year we describe the tool set that was created for RoboCup 3D. It contains a GUI for agent and world-state visualization and for the evaluation of localization algorithms and benchmarks in general, a visual editor for creating and debugging extended behavior networks, a live movement tool to interact with the joints, and finally a tool for editing behavior motor files.
After having described many different aspects of our team software in previous years, in this paper we take the liberty of describing the magmaChallenge framework provided by the magmaOffenburg team. The framework is used as a benchmark tool to run different challenges, such as the running challenge in 2014 or the kick accuracy challenge in 2015. This description should serve as documentation to simplify maintenance by the community and the addition of new benchmarks in the future.