Refine
Year of publication
Document Type
- Conference Proceeding (926)
- Article (reviewed) (551)
- Article (unreviewed) (124)
- Part of a Book (65)
- Contribution to a Periodical (58)
- Book (29)
- Patent (29)
- Letter to Editor (28)
- Doctoral Thesis (19)
- Working Paper (18)
Conference Type
- Conference Paper (730)
- Conference Abstract (134)
- Other (34)
- Conference Poster (22)
- Conference Proceedings Volume (8)
Language
- English (1854)
Is part of the Bibliography
- yes (1854)
Keywords
- RoboCup (32)
- Thin-Layer Chromatography (26)
- Gamification (17)
- Machine Learning (17)
- Export (15)
- Communication (15)
- TRIZ (13)
- Plasticity (12)
- 3D printing (11)
- Deep Learning (11)
Institute
- Fakultät Maschinenbau und Verfahrenstechnik (M+V) (561)
- Fakultät Elektrotechnik und Informationstechnik (E+I) (until 03/2019) (486)
- Fakultät Elektrotechnik, Medizintechnik und Informatik (EMI) (from 04/2019) (357)
- Fakultät Wirtschaft (W) (255)
- INES - Institut für nachhaltige Energiesysteme (165)
- Fakultät Medien und Informationswesen (M+I) (until 21.04.2021) (146)
- ivESK - Institut für verlässliche Embedded Systems und Kommunikationselektronik (142)
- IMLA - Institute for Machine Learning and Analytics (71)
- ACI - Affective and Cognitive Institute (58)
- Fakultät Medien (M) (from 22.04.2021) (51)
Open Access
- Open Access (798)
- Closed Access (625)
- Closed (241)
- Bronze (136)
- Gold (72)
- Diamond (63)
- Hybrid (44)
- Green (12)
The usefulness of incorporating laser gyros for pointing and fiber gyros with an extremely small random-walk coefficient for inertial stabilization during tracking is being investigated for altazimuth-mounted telescopes. A star tracker is expected to help stabilize the long-term gyro bias. Gyro and telescope specifications have been derived by means of computer simulations and systems analyses.
A polarization mode dispersion measurement set-up based on a Mach-Zehnder interferometer was realized. Measurements were carried out on short highly birefringent fibers and on long standard telecommunication single-mode fibers. In order to ensure highly accurate results, special emphasis was placed on the evaluation of the interference pattern. The procedure will be described in detail and practical measurement results will be presented.
The prototype of an optical gyro encoder (OGE) was successfully tested on the NTT telescope in September 1993. The OGE consists of a ring laser gyro and a fiber optic gyro with their input axes parallel. The gyro output signals are compensated for earth rotation and misalignment and are subsequently integrated to obtain the angles. An adaptive digital control loop locks the fiber optic gyro to the laser gyro data. Thus the combined output has the precision of the laser gyro and the low noise of the fiber optic gyro. Specifically, the bias stability is better than 2 × 10⁻³ deg/h, the scale factor accuracy better than 1 ppm, the random walk coefficient better than 5 × 10⁻⁴ deg/√h, and the resolution better than 3 × 10⁻⁴ arcsec. The OGE was mounted in the altitude and in the azimuth axis of the telescope, and its data were compared with the telescope disk encoder data. The test data show that the pointing accuracy is about 1 arcsec and the tracking accuracy 0.1 arcsec over a time of 30 seconds. This accuracy is sufficient for the Very Large Telescope, for instance.
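The idea of locking the low-noise fiber gyro to the low-drift laser gyro can be sketched as a simple complementary filter: the difference between the two rate signals is low-pass filtered to estimate the fiber gyro's bias, which is then subtracted. This is a minimal illustrative sketch, not the authors' adaptive control loop; all names and values are invented for the example.

```python
# Minimal sketch (not the OGE's actual algorithm): blend a low-noise
# fiber-optic gyro (FOG) with a low-drift ring laser gyro (RLG) by
# slowly tracking the FOG bias against the RLG and subtracting it.

def blend(fog_rates, rlg_rates, alpha=0.01):
    """Return bias-corrected rates; alpha sets the lock-loop bandwidth."""
    bias = 0.0
    out = []
    for fog, rlg in zip(fog_rates, rlg_rates):
        # first-order low-pass of the FOG-RLG difference = bias estimate
        bias += alpha * ((fog - rlg) - bias)
        out.append(fog - bias)
    return out

# With a constant fiber-gyro bias of 0.5 deg/h on a true rate of
# 1.0 deg/h, the corrected output converges toward the true rate.
corrected = blend([1.5] * 2000, [1.0] * 2000)
```

The short-term noise of the output is the fiber gyro's, while the long-term mean is pinned to the laser gyro, which mirrors the precision/noise split described in the abstract.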
This paper treats Brillouin backscattering in a single-mode optical fiber and its implications for the Brillouin Ring Laser Gyroscope (BRLG). The BRLG consists of a fiber ring cavity in which stimulated Brillouin scattering is induced, providing two resonant counterpropagating backscattered waves. If this cavity rotates around its axis, the Sagnac effect shifts the resonant frequencies of the backscattered waves; the resulting frequency offset between the counterpropagating waves is proportional to the rotation rate Ω. Some reported Brillouin spectra exhibit several peaks, which means that one pump wave provides at least two backscattered waves with distinguishable frequencies. In order to understand this multi-backscattering and to take advantage of it for the BRLG, we present results of a simulation of the Brillouin backscattering in a single-mode optical fiber.
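The Sagnac splitting mentioned above is commonly written in the standard ring-resonator form (a textbook relation, not a formula quoted from the paper), where A is the area enclosed by the fiber ring, P its perimeter, and λ the optical wavelength:

```latex
\Delta f = \frac{4A}{\lambda P}\,\Omega
```

The frequency difference Δf thus scales linearly with the rotation rate Ω, which is what makes the beat note between the two counterpropagating waves usable as a gyroscope readout.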
Vortex breakdown phenomena in rotating fluids are investigated both theoretically and experimentally. The fluid is contained in a cone between two spherical surfaces. The primary swirling motion is induced by the rotating lower boundary. The upper surface can be fixed with a no-slip condition or can be a stress-free surface. Depending on these boundary conditions and on the Reynolds number, novel structures of recirculation zones are realized. The axisymmetric flow patterns are simulated numerically by a finite difference method. Experiments are done to visualize the topological structure of the flow pattern and to observe the existence ranges of the different recirculating flows. The comparison between theory and experiment shows good agreement with respect to the topological structure of the flow.
This paper treats the interaction between acoustic modes and light (Brillouin scattering) in a single-mode optical fibre. Different observed spectra of the Brillouin backscattering in several fibres have already been reported. In order to gain a clear idea of the process, we made a simulation able to 'draw' the theoretical Brillouin spectrum of an optical fibre and to identify the origin of the observed backscattered lines.
First, the model and the computation method used in our simulation are described. Second, the experimentally observed spectra of two real fibres are compared with their computed spectra. Real spectra and simulated spectra are in good agreement.
Our work provides an interesting tool to investigate the changes in the Brillouin spectrum when the input parameters (characteristics of an optical fibre) vary. This should give useful indications to people working on systems which use Brillouin backscattering.
Formal verification (FV) is considered by many to be complicated and to require considerable mathematical knowledge for successful application. We have developed a methodology in which we have added formal verification to the verification process without requiring any knowledge of formal verification languages. We use only finite-state machine notation, which is familiar and intuitive to designers. Another problem associated with formal verification is state-space explosion. If that occurs, no result is returned; our method switches to random simulation after one hour without results, and no effort is lost. We have compared FV against random simulation with respect to development time, and our results indicate that FV is at least as fast as random simulation. FV is superior in terms of verification quality, however, because it is exhaustive.
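The trade-off between exhaustive formal exploration and random simulation can be illustrated with a toy finite-state machine. This is a hypothetical example, not the authors' methodology or tool: exhaustive breadth-first exploration is complete over the reachable state space, while a random input sequence only samples it.

```python
# Toy illustration: exhaustive BFS over an FSM's state space is
# complete; random simulation visits only a sampled subset of states.
from collections import deque
import random

# Hypothetical 3-state machine: states 0..2, inputs 'a'/'b'.
TRANS = {(0, 'a'): 1, (0, 'b'): 0,
         (1, 'a'): 1, (1, 'b'): 2,
         (2, 'a'): 0, (2, 'b'): 2}

def reachable(start=0):
    """Exhaustive exploration: every state reachable from `start`."""
    seen, queue = {start}, deque([start])
    while queue:
        s = queue.popleft()
        for inp in ('a', 'b'):
            t = TRANS[(s, inp)]
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return seen

def random_walk(steps, start=0, seed=0):
    """Random simulation: states visited by one random input sequence."""
    rng = random.Random(seed)
    s, visited = start, {start}
    for _ in range(steps):
        s = TRANS[(s, rng.choice('ab'))]
        visited.add(s)
    return visited
```

Exhaustive exploration guarantees coverage of all reachable states, which is why the abstract calls formal verification superior in verification quality; the random walk's coverage depends on luck and run length.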
The flow field-flow fractionation (FIFFF) technique is a promising method for separating and analysing particles and large macromolecules from a few nanometers to approximately 50 μm. A new fractionation channel is described featuring well-defined flow conditions even for low channel heights, together with convenient assembly and operation. The applicability of the new flow field-flow fractionation channel is demonstrated by the analysis of pigments and other small particles of technical interest in the submicrometer range. The experimental results, including multimodal size distributions, are presented and discussed.
An algorithm is presented that has successfully been utilized in practice for several years. It improves data analysis in chromatography. The program runs in an extremely reliable way and evaluates chromatographic raw data with an acceptable error. The algorithm requires a minimum of preliminaries and integrates even unsmoothed noisy data correctly.
We generalize the fluid flow problem of an oscillating flat plate (Stokes' second problem) in two directions. First, we discuss the oscillating porous flat plate with superimposed blowing or suction. The second generalization concerns an increasing or decreasing velocity amplitude of the oscillating flat plate. Finally we show that a combination of both effects is possible as well.
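For reference, the classical solution of Stokes' second problem that these generalizations start from is the standard textbook result (not quoted from the paper), with U₀ the plate velocity amplitude, ω the oscillation frequency, and ν the kinematic viscosity:

```latex
u(y,t) = U_0\, e^{-k y}\cos(\omega t - k y),
\qquad k = \sqrt{\frac{\omega}{2\nu}}
```

The velocity oscillation decays exponentially away from the plate over the Stokes layer thickness 1/k; blowing/suction and a time-varying amplitude modify this decay and phase structure.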
Shapes and structures of vortex breakdown phenomena in rotating fluids are visualized. We investigate the flow in a cylindrical container and in a cone between two spherical surfaces. The primary swirling flow is induced by the rotating upper disk in the cylindrical case and by the lower boundary in the spherical case. The upper surface can be fixed with a no-slip condition or can be a stress-free surface. Depending on these boundary conditions and on the Reynolds number, novel structures of recirculation zones are realized. Experiments are done to visualize the topological structure of the flow and to determine its existence range as a function of the geometry and rotation rate. A comparison between the experimental and theoretical approaches shows good agreement with respect to the topological structures of the flows.
Rotating flow systems are often used to study stability phenomena and structure developments. The closed spherical gap problem is generalized into an open flow system by superimposing a mass flux in the meridional direction. The basic solutions at low Reynolds numbers are described by analytical methods. The nonlinear supercritical solutions are simulated numerically and realized in experiments. Novel steady and time-dependent modes of flow are obtained. The extensive results concern the stability behaviour, non-uniqueness of supercritical solutions, symmetry behaviour and transitions between steady and time-dependent solutions. The experimental investigations concern the visualization of the various instabilities and the quantitative description of the flow structures, including the laminar-turbulent transition. A comparison between theoretical and experimental results shows good agreement within the limits of the rotationally symmetric solutions of the theory.
In this paper a high-performance thin-layer chromatography (HPTLC) scanner is presented in which a special fibre arrangement is used as the HPTLC-plate scanning interface. Measurements are taken with a set of 50 fibres at a distance of 400 to 500 μm above the HPTLC plate. Spatial resolutions on the HPTLC plate of better than 160 μm are possible. It takes less than 2 min to scan 450 spectra simultaneously in a range of 198 to 610 nm. The key improvement is the use of highly transparent glass fibres, which provide excellent transmission at 200 nm, and of a special fibre arrangement for plate illumination and detection.
A system for the on-line/in-line measurement of soot particle sizes and concentrations in the undiluted exhaust gas of diesel engines was developed and successfully tested. The unit uses the individual attenuations of three different laser wavelengths and is combined with an optical cell (White principle) with adjustable path lengths from 2.5 to 15 meters.
The title expresses goals the Kansas Geological Survey (KGS) has been working toward for some time. This report extends concepts and objectives developed while working on an earlier effort for effective interactive digital maps on the Internet. That work was reported to the 1998 DMT Workshop in Champaign, Illinois (Ross, 1998). The current project goes beyond previous efforts that focused on methods for serving the contents of a geographic information system (GIS): the points, lines, and polygons representing features of the digital geologic map, and the data in the attribute tables of the GIS describing those features.
The importance of obtaining simultaneous particle size and concentration values has grown with the continuing discussion of the health effects of particulate emissions generated by internal combustion engines, in particular of diesel soot emissions. In the present work an aerosol measurement system is described that delivers information about particle size and concentration directly from the undiluted exhaust gas.
Using three laser diodes of different wavelengths, combined into one parallel light beam, each spectral attenuation is analysed by a single detector; the particle diameter and concentration are evaluated by means of Mie theory and displayed on-line at a frequency of 1 Hz. The system includes an optical long-path cell (White principle) with an adjustable path length from 2.5 to 15 m, which allows analysis within a broad concentration range.
On-line measurements of the particulate emissions in the hot, undiluted exhaust of diesel engines are presented under stationary and transient engine load conditions. Mean particle diameters well below 100 nm are detected for modern diesel engines. The measured particle concentration agrees excellently with traditional gravimetric measurements of the diluted exhaust. Additionally, measurements of particle emissions (mostly condensed hydrocarbons) from a two-stroke engine are presented and discussed.
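The three-wavelength attenuation principle behind this instrument can be sketched with the Beer-Lambert law. This is an illustrative sketch only, not the instrument's actual algorithm: in the real system the extinction cross sections come from Mie theory, whereas the values below are invented for the example.

```python
# Illustrative sketch: spectral attenuation over a White cell follows
# the Beer-Lambert law, I/I0 = exp(-N * Cext(lambda) * L).  The cross
# sections CEXT below are hypothetical placeholders, not Mie results.
import math

L_PATH = 10.0    # optical path length in m (White cell, 2.5-15 m)
N_TRUE = 1.0e13  # assumed particle number concentration per m^3
CEXT = {650e-9: 3.0e-15, 810e-9: 2.0e-15, 980e-9: 1.2e-15}  # m^2, invented

def transmission(wavelength, n=N_TRUE):
    """Fraction of light transmitted at the given laser wavelength."""
    return math.exp(-n * CEXT[wavelength] * L_PATH)

def concentration(trans, wavelength):
    """Invert the Beer-Lambert law for the number concentration."""
    return -math.log(trans) / (CEXT[wavelength] * L_PATH)
```

Measuring the attenuation at three wavelengths gives three such equations; their ratios depend on particle size through Cext(λ), which is what allows the instrument to recover both a mean diameter and a concentration.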
A prototype multiwavelength sensor able to characterise soot emissions in diesel exhaust in terms of size and concentration has been tested against other methods for diesel particle measurement, such as electrical mobility sizing (SMPS) and raw exhaust gravimetric sampling (RES). Measurements carried out with the prototype sensor were correlated with the SMPS by assuming spherical and/or fractal aggregate morphology of the particles. Correlation of RES gravimetric data with the sensor and the SMPS yielded a solid density of 2.3 g/cm³ for the soot particles.
HPTLC (High Performance Thin Layer Chromatography) is a well-known and versatile separation method which shows many advantages when compared to other separation techniques. The method is fast and inexpensive and does not need time-consuming pretreatments. For visualisation of the sample distribution on an HPTLC plate we developed a new and sturdy HPTLC scanner. The scanner allows simultaneous registration of spectra in a range from 198 nm to 612 nm with a spectral resolution of better than 0.8 nm. The on-plate spatial resolution is better than 160 μm. The measurement of 450 spectra in one separation track does not need more than two minutes. The new diode-array scanner offers a fast survey of a TLC separation and makes various chemometric applications possible. For compound identification a cross-correlation function is described to compare UV sample spectra with appropriate library data. The cross-correlation function described herein can also be used for purity testing. Unresolved peaks can be virtually separated by use of a least-squares fit algorithm. In summary, the diode-array system delivers much more information than commonly used TLC scanners.
High performance thin layer chromatography (HPTLC) is a frequently used separation technique which works well for quantification of caffeine and quinine in beverages. Competing separation techniques, e.g. high-performance liquid chromatography (HPLC) or gas chromatography (GC), are not suitable for sugar-containing samples, because these methods need special pretreatment by the analyst. In HPTLC, however, it is possible to separate ‘dirty’ samples without time-consuming pretreatment, because disposable HPTLC plates are used. A convenient method for quantification of caffeine and quinine in beverages, without sample pretreatment, is presented here. The basic theory of in-situ quantification in HPTLC by use of remitted light is introduced, and several linearization models are discussed.
A home-made diode-array scanner has been used for quantification; this, for the first time, enables simultaneous measurements at different wavelengths. The new scanner also enables fluorescence evaluation without further equipment. Simultaneous recording at different wavelengths improves the accuracy and reliability of HPTLC analysis. These aspects result in substantial improvement of in-situ quantitative densitometric analysis and enable quantification of compounds in beverages.
A new diode-array scanner in combination with a computer-controlled application system meets all the demands of modern HPTLC measurement. Automatic application, simultaneous measurements at different wavelengths, and different linearization models enable appropriate evaluation of all analytical questions. The theory of error propagation recommends quantification at reflectance values smaller than 0.8; this can be verified only by use of diode-array scanning. The same theory also recommends quantification by use of peak height data, because the theory predicts best precision only for peak height evaluation. Diode-array scanning with reflectance monitoring enables appropriate validation in TLC and HPTLC analysis. All these aspects result in substantial improvement of in-situ quantitative densitometric analysis, and simultaneous recording at different wavelengths opens the way for chemometric evaluation, e.g. peak purity monitoring, which improves the accuracy and reliability of HPTLC analysis.
A systematic toxicological analysis procedure using high-performance thin layer chromatography in combination with fibre-optical scanning densitometry for identification of drugs in biological samples is presented. Two examples illustrate the practicability of the technique: first, the identification of a multiple intake of analgesics (codeine, propyphenazone, tramadol, flupirtine and lidocaine), and second, the detection of the sedative diphenhydramine. In both cases, authentic urine specimens were used. The identifications were carried out by an automatic measurement and computer-based comparison of in situ UV spectra with data from a compiled library of reference spectra using the cross-correlation function. The technique allowed parallel recording of chromatograms and in situ UV spectra in the range of 197–612 nm. In contrast to conventional densitometry, no dependence of the UV spectra on substance concentration was observed in the range of 250–1000 ng/spot.
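The library comparison step can be sketched as scoring a measured spectrum against each reference with the zero-lag normalized cross-correlation (Pearson's r). This is a hedged sketch of the general technique, not the authors' software; the spectra below are short made-up intensity lists, not real drug spectra.

```python
# Sketch of spectral library matching by cross-correlation: the
# measured in-situ UV spectrum is scored against each library entry
# and the best-scoring compound is reported.
import math

def correlate(x, y):
    """Normalized zero-lag cross-correlation of two equal-length spectra."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

LIBRARY = {                      # hypothetical reference spectra
    'codeine':  [0.1, 0.5, 0.9, 0.4, 0.1],
    'tramadol': [0.8, 0.3, 0.2, 0.6, 0.9],
}

def identify(measured):
    """Return the library entry with the highest correlation score."""
    return max(LIBRARY, key=lambda name: correlate(measured, LIBRARY[name]))
```

Because the correlation is normalized, the score is insensitive to overall intensity scaling, which fits the abstract's observation that the UV spectra did not depend on substance concentration in the tested range.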
The aim of this study was to develop a biomechanically validated finite element model to predict the biomechanical behaviour of the human lumbar spine in compression.
For validation of the finite element model, an in vitro study was performed: Twelve human lumbar cadaveric spinal segments (six segments L2/3 and six segments L4/5) were loaded in axial compression using 600 N in the intact state and following surgical treatment using two different internal stabilisation devices. Range of motion was measured and used to calculate stiffness.
A finite element model of a human spinal segment L3/4 was loaded with the same force in intact and surgically altered state, corresponding to the situation of biomechanical in vitro study.
The results of the cadaver biomechanical and finite element analysis were compared. As they were close together, the finite element model was used to predict: (1) load-sharing within human lumbar spine in compression, (2) load-sharing within osteoporotic human lumbar spine in compression and (3) the stabilising potential of the different spinal implants with respect to bone mineral density.
A finite element model as described here may be used to predict the biomechanical behaviour of the spine. Moreover, the influence of different spinal stabilisation systems may be predicted.
HPTLC (High Performance Thin Layer Chromatography) is a well-known and versatile separation method which offers many advantages and options in comparison to other separation techniques. The method is fast and inexpensive and does not need time-consuming pretreatments. Using fiber-optic elements for controlled light guiding, the TLC method was significantly improved: the new HPTLC system is able to measure simultaneously at different wavelengths without destroying the plate surface or the analytes on the surface. For registration of the sample distribution on an HPTLC plate we developed a new and sturdy diode-array HPTLC scanner which allows registration of spectra on the TLC plates in the range of 198 nm to 610 nm with a spectral resolution better than 1.2 nm. The spatial resolution on the plate is better than 160 μm. In the spectral mode, the new HPTLC scanner delivers much more information than commonly used TLC scanners. The measurement of 450 spectra of one separation track does not need more than three minutes. In the fixed-wavelength mode, however, the contour plot can be measured within 15 seconds. In this case, the signal is summed and averaged over a spectral range with a FWHM from 10 nm to 25 nm, depending on the substance under test. The new diode-array HPTLC scanner makes various chemometric applications possible. The new method can easily be used in clinical diagnostic systems, e.g. for blood and urine investigations. In addition, new applications are possible. For example, PAHs, with their richly structured spectra, were studied: although the separation is incomplete, the 16 compounds can be quantified using suitable wavelengths.
The use of a TLC scanner can be regarded as a key step in high performance thin layer chromatography (HPTLC). Densitometric measurements transform the substance distribution on a TLC plate into digital computer data. Systems that allow quantitative measurements have been available for many years for either fluorescence or ultraviolet absorption measurements, and lately the reflection analysis mode is the most common application for both types. New scanning approaches are designed to aid the analyst who has common demands for TLC densitometry without using special data, such as scanned images. Two examples developed lately in the laboratories of the authors are described in this paper. These approaches were developed on the basis of the current needs of analysts who employ TLC as a tool in research as well as in routine analysis. One approach aims to support analysts in economically disadvantaged areas, where cost-intensive apparatus is unsuitable but trace analysis by simple means is required. The other system allows the spectral determination of chromatographic spots on TLC plates covering the ultraviolet and visible range, thus revealing highly desired information for the analyst.
Time Resolved Measurements of Soot Concentrations and Mean Particle Sizes during EUDC and ECE Cycles
(2002)
The bandwidth behavior of graded-index multimode fibers (GI-MMFs) for different launching conditions is investigated to understand and characterize the effect of differential mode delay. In order to restrict the launch-power distribution, the near field of a single-mode fiber is used to produce a controlled restricted launch. The baseband response is measured by observing the broadening of a narrow input pulse (time-domain measurement). The paper verifies the degradation in bandwidth due to profile distortion by scanning the spot of the single-mode fiber with a transversal offset from the center of the test sample. In addition, the impact of the launch-power distribution tuned by different spot-size diameters is demonstrated. Measurements were taken on ‘older’ 50-μm and 62.5-μm GI-MMFs as well as on more recently developed laser-performance-optimized fibers.
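A common way to turn such time-domain pulse-broadening measurements into a bandwidth figure (a standard Gaussian-pulse approximation, not a formula quoted from the paper) is to deconvolve the input pulse width and apply the time-bandwidth product:

```latex
\Delta\tau = \sqrt{\tau_{\mathrm{out}}^{2} - \tau_{\mathrm{in}}^{2}},
\qquad B_{3\,\mathrm{dB}} \approx \frac{0.44}{\Delta\tau}
```

Here τ_in and τ_out are the FWHM of the launched and received pulses; larger differential mode delay means a larger Δτ and hence a lower measured bandwidth.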
The central purpose of this paper is to present a novel framework supporting the specification, implementation and retrieval of media streaming services. It provides an integrated service development environment comprising a streaming service model, a service specification language and several implementation and retrieval tools. Our approach is based on a clear separation between a streaming service specification and its implementation by a distributed application, and it can be used for different streaming paradigms, e.g. push and pull services.
A Simple and Reliable HPTLC Method for the Quantification of the Intense Sweetener Sucralose®
(2003)
This paper describes a simple and fast thin layer chromatography (TLC) method for the monitoring of the relatively new intense sweetener Sucralose® in various food matrices. The method requires little or no sample preparation to isolate or concentrate the analyte. The Sucralose® extract is separated on amino‐TLC‐plates, and the analyte is derivatized “reagent‐free” by heating the developed plate for 20 min at 190°C. Spots can be measured either in the absorption or fluorescence mode. The method allows the determination of Sucralose® at the levels of interest regarding foreseen European legislation (>50 mg/kg) with excellent repeatability (RSD = 3.4%) and recovery data (95%).
Fluorescence Enhancement of Pyrene Measured by Thin-Layer Chromatography with Diode-Array Detection
(2003)
In-situ densitometry for qualitative or quantitative purposes is a key step in thin-layer chromatography. It offers a simple way of quantifying by measuring the optical density of the separated spots directly on the plate. A new TLC scanner has been developed which is able to measure TLC plates or HPTLC plates, at different wavelengths simultaneously, without destroying the plate surface. The system enables absorbance and fluorescence measurements in one run. Fluorescence measurements are possible without filters or other adjustments.
The measurement of fluorescence from a TLC plate is a versatile means of making TLC analysis more sensitive. Fluorescence measurements with the new scanner are possible without filters or special lamps. Improvement of the signal-to-noise ratio is achieved by wavelength bundling. During plate scanning the scattered light and the fluorescence are both emitted from the surface of the TLC plate and this emitted light provides the desired spectral information from substances on the TLC plate. The measurement of fluorescence spectra and absorbance spectra directly from a TLC plate is based on differential measurement of light emerging from sample-free and sample-containing zones.
The literature recommends dipping TLC plates in viscous liquids to enhance fluorescence. Measurement of the fluorescence and absorbance spectra of pyrene spots reveals the mechanism of enhancement by plate dipping in viscous liquids: blocked contact of the fluorescent molecules with the stationary phase or other sample molecules is responsible for the enhanced fluorescence at lower concentrations.
In conclusion, dipping in TLC analysis is no miracle. It is based on mechanisms similar to those observable in liquids. The measured TLC spectra are also very similar to liquid spectra, and this makes TLC spectroscopy an important tool in separation analysis.
Details design tools and techniques for high-performance ASIC design. Shows best practices for creating reusable designs in an SoC design methodology.
The free convection in a vertical gap is generalized to obtain new analytical solutions of the Boussinesq equations. The steady and time-dependent solutions for the temperature and velocity distribution are discussed in detail as functions of the mass flux in the vertical direction. The range of existence for flows with and without backflow is obtained. The transient behaviour of the solutions during the time-dependent development displays interesting physical features.
The structure of the separation bubble that appears in the secondary meridional flow between two coaxially rotating spheres at low and finite Reynolds number (Re) is considered. The low Re analytical study was motivated by recognizing some errors in the analytical work on this problem by Arunachalam and Majhi (1987, Q. Jl Mech. Appl. Math., 40, 47) whilst the finite Re experimental study was motivated by the desire to observe the separation bubble in the laboratory. Though the finite Re experiments were performed in a confined apparatus, they exhibit the qualitative features of the low Re theoretical predictions for the axisymmetric separation bubble that encloses two toroidal vortices symmetrically disposed above and below the mid‐plane of sphere separation, but strong effects of confinement are apparent. The flows observed include (i) a wall‐attached bubble symmetric about the mid‐plane at low Re, (ii) symmetric free‐standing bubbles at moderate Re, and (iii) an asymmetric bubble with flow separating from one sphere and attaching to the support shaft between the spheres at sufficiently high Re.
Bluetooth personal area networks (PANs) share the 2.4 GHz ISM spectrum with the IEEE 802.11b wireless local area networks (WLANs). With the popularity of wireless devices, this ISM spectrum is becoming more and more crowded. As a result of this interference between WLANs and PANs, the performance of each network is decreased. Current research has not significantly covered the degrading impact of an 802.11b interferer on Bluetooth voice transmission. Within this project, simulations were carried out to precisely study the impact of an 802.11b interferer on the performance of Bluetooth voice transmission at different ratio levels of Bluetooth power to WLAN power at the receiver side. Furthermore, the impact of SNR on the Bluetooth voice performance and the benefit of using the SCORT packet type was analysed as well. Based on the results presented, network performance can be evaluated at the desired activity level.
The central purpose of this paper is to present a novel framework supporting the specification and implementation of media streaming services using XML and the Java Media Framework (JMF). It provides an integrated service development environment comprising a streaming service model, a service specification language and several implementation and retrieval tools. Our approach is based on a clear separation between a streaming service specification and its implementation by a distributed JMF application, and it can be used for different streaming paradigms, e.g. push and pull services.
The growing need for suitable systems of record, as companies seek to maximize performance by harnessing the knowledge of their businesses, is discussed. Focused systems of record deliver a clear and consistent view even as they address a range of functions. Enterprise resource planning (ERP), as the financial system of record, embodies that view of manufacturing, inventory management, accounting and order processing. Customer relationship management (CRM), as a system of record, taps not only into marketing, sales and service, but also into product development.
A new formula is presented for transforming fluorescence measurements in accordance with Kubelka-Munk theory. The fluorescence signals, the absorption signals, and data from a selected reference are combined in one expression. Only diode-array techniques can measure all the required data simultaneously to linearize fluorescence data correctly. To prove the new theory, HPTLC quantification of the analgesic flupirtine was performed over the mass range 300 to 5000 ng per spot. The fluorescence calibration curve was linear over the whole range. The transformation of fluorescence measurements into linear mass-dependent data extends the technique of in-situ fluorescence analysis to the high concentration range. It also extends Kubelka-Munk theory from absorption to fluorescence analysis. The results presented also emphasize the importance of Kubelka-Munk theory for in-situ measurements in scattering media, especially in planar chromatography.
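For context, the classical Kubelka-Munk relation for absorption measurements on scattering layers, which the paper extends to fluorescence, links the measured reflectance R∞ to the absorption and scattering coefficients K and S (this is the standard form, not the paper's new fluorescence formula):

```latex
F(R_\infty) = \frac{(1 - R_\infty)^{2}}{2\,R_\infty} = \frac{K}{S}
```

Applying F to remission data linearizes the otherwise nonlinear relation between measured signal and analyte mass, which is the same role the paper's combined expression plays for fluorescence signals.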
In this paper, we propose a new streaming media service development environment comprising of a streaming media service model, a XML based service specification language and several implementation and configuration management tools. Our approach is based on a high level streaming service specification language, which allows specifying a service in terms of media objects, QoS, and distribution policies. Driven by such a streaming service specification and a streaming component library implemented with Java Media Framework, the required distributed application infrastructure is generated automatically by a service manager. To support flexible instantiation and termination of services as well as change management during runtime, e.g. migration or substitution of streaming components, we introduce instantiation and termination rules, and reconfiguration rules.
In contrast to their traditional, non-interactive counterparts, interactive dynamic visualisations allow users to adapt their form and content to their individual cognitive skills and needs. Provided that the interactive features allow for intuitive use without increasing cognitive load, interactive videos should therefore lead to more efficient forms of learning. This notion was tested in an experimental study, where participants learned to tie four nautical knots of different complexity by watching either non-interactive or interactive videos. The results show that in the interactive condition, participants used the interactive features like stopping, replaying, reversing or changing speed to adapt the pace of the video demonstration. This led to an uneven distribution of their attention and cognitive resources across the videos, which was more pronounced for the difficult knots. Consequently, users of non-interactive video presentations needed substantially more time than users of the interactive videos to acquire the necessary skills for tying the knots.
Nowadays the processing power of mobile phones, smartphones and PDAs is increasing, as is the transmission bandwidth. Nevertheless, there is still a need to reduce the content and the processing of the data. We discuss proposals and solutions for dynamic reduction of the transmitted content. Device-specific properties are taken into account, with the aim of reducing the processing power required at the client side to display the 3D (virtual reality) data. Therefore, well-known technologies, e.g. data compression, are combined with newly developed ideas to reach the goal of adaptive content transmission. To achieve a device-dependent reduction of processing power, the data have to be preprocessed at the server side, or the server even has to take over functionality of weak mobile devices.
Experimental and theoretical investigations of the time of equalization of the concentration of an impurity in a rectangular flow‐type chamber have been carried out. It has been shown that the process of equalization of the concentration with time is exponential in character. The characteristic equalization time has been computed using the theory of turbulent diffusion. Theoretical results describe experimental regularities with an accuracy of about 10%. The value of the coefficient of turbulent diffusion for different configurations of flows in the chamber has been obtained from a comparison of experimental and calculated results.
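The exponential equalization law described above can be written as C(t) = C∞ + (C₀ − C∞)·exp(−t/τ). The following is a minimal sketch under that assumption; the characteristic time τ would follow from the turbulent-diffusion estimate in the paper, and all parameter names here are illustrative.

```python
import math

def concentration(t: float, c0: float, c_inf: float, tau: float) -> float:
    """Impurity concentration during equalization in the flow chamber.

    c0    -- initial concentration at t = 0
    c_inf -- equilibrium concentration reached for t >> tau
    tau   -- characteristic equalization time (from turbulent diffusion)
    """
    return c_inf + (c0 - c_inf) * math.exp(-t / tau)
```

At t = 0 the function returns c0, and for times much larger than tau it approaches c_inf, matching the exponential character reported above.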
Lattice vibrations and electronic transitions in the rare-earth metals: Praseodymium under pressure
(2004)
Praseodymium was investigated by Raman spectroscopy under pressure. A negative pressure shift of the E2g mode is observed in the dhcp phase, which indicates that the initial structural sequence hcp→Sm−type→dhcp→fcc as a whole in the regular lanthanides is associated with a softening of this mode. The pressure response of the phonon modes, observed in the monoclinic and α-uranium phases, where 4f bonding becomes important, is characteristic for anisotropic bonding properties.
The goal of eLearning services integration in manufacturing is, through the development of new multimedia solutions, to accelerate and enhance the ability of the manufacturing industry to capitalise on the emergence of a powerful global information infrastructure. The key components of our approach are: (1) an XML based streaming service specification language; (2) automatic configuration of distributed eLearning streaming service implementations; (3) a set of Web services for searching, registration, and creation of streaming services; (4) caching and replication policies based on timing information derived from the service specifications. We also introduce a new concept for cache management during runtime, e.g., content is distributed to cache servers located at the edge of a network close to the client.
Thin-layer chromatography (TLC) is a well-established and widely used separation technique. Most undergraduate students of chemistry or food science use TLC as a primitive separation tool, which does not need more than small pieces of TLC plates, a glass jar and some solvents. TLC has evolved from a simple separation method of the past into an instrumental technique that offers automation, reproducibility and accurate quantification for a wide variety of applications [1]. The use of modern 10 × 10 cm TLC plates with narrow particle size distribution is called high-performance thin-layer chromatography (HPTLC), to distinguish the method from the use of traditional 20 × 20 cm TLC plates.
Diode-array planar chromatography is a versatile tool for the identification of pharmaceutical substances. In this paper thirty-three compounds with benzodiazepine properties were investigated and the separation conditions for silica gel HPTLC plates and three mobile phases were optimized. Diode-array HPTLC makes it possible to identify all the compounds with high certainty down to a level of 20 ng. An algorithm for spectral recognition is presented which combines the spectral match with the RF values from the three separation steps into one fit factor. This set of data is unique for each of the compounds investigated and enables unequivocal identification. The method is rapid, inexpensive, and sensitive down to a level of 20 ng mL⁻¹.
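A common basis for this kind of spectral-recognition step is a correlation-based fit factor between a measured spectrum and a library spectrum. The sketch below shows only that generic idea (Pearson correlation); the paper's exact combination with the RF values is not reproduced, and the function name is illustrative.

```python
def fit_factor(measured: list[float], library: list[float]) -> float:
    """Pearson correlation between a measured and a library spectrum.

    Returns 1.0 for a perfect linear match, -1.0 for an inverted one;
    a simple stand-in for a spectral fit factor.
    """
    n = len(measured)
    mean_m = sum(measured) / n
    mean_l = sum(library) / n
    cov = sum((x - mean_m) * (y - mean_l) for x, y in zip(measured, library))
    var_m = sum((x - mean_m) ** 2 for x in measured)
    var_l = sum((y - mean_l) ** 2 for y in library)
    return cov / (var_m * var_l) ** 0.5
```

A spectrum that is a scaled copy of the library entry scores 1.0; dissimilar spectra score lower, so thresholding the fit factor yields the identification decision.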
The iSign project started in 2000 as a web-based laboratory setting for students of electrical engineering. In the meantime it has broadened into a heterogeneous learning environment offering learning material, adaptive user settings and access to a simulation tool. All these offerings can be accessed via web and wireless by different clients, such as PCs, PDAs and mobile phones. User-adaptive systems offer a unique and personalised environment for every learner and are therefore a very important aspect of modern e-learning systems. The iSign project aims to personalise the content structure based on the learner's behaviour, content pattern, policies, and system environment. The second aspect of the recent research and development within this project is the generation of suitable content and presentation for different clients. This generation is additionally based on the user preferences in order to obtain the desirable presentation for a given device. New, valuable features are added to the mobile application, empowering the user not only to control the simulation process with his mobile device but also to input data, view the simulation's output and evaluate the results. Experiences with students have helped to improve functionality and look-and-feel whilst using the iSign system. Our goal is to provide unconstrained, continuous and personalised access to the laboratory settings and learning material anywhere and at any time with different devices.
Specific prototypes of sedimentation field flow fractionation devices (SdFFF) have been developed with relative success for cell sorting. However, no data are available to compare these apparatus with commercial ones. In order to compare with other devices mainly used for non-biological species, biocompatible systems were used for standard particle (latex: 3–10 μm of different size dispersities) separation development. In order to enhance size dependent separations, channels of reduced thickness were used (80 and 100 μm) and channel/carrier-phase equilibration procedures were necessary. For sample injection, the use of inlet tubing linked to the FFF accumulation wall, common for cell sorting, can be extended to latex species when they are eluted in the Steric Hyperlayer elution mode. It avoids any primary relaxation steps (stop flow injection procedure) simplifying series of elution processing. Mixtures composed of four different monodispersed latex beads can be eluted in 6 min with 100 μm channel thickness.
The three-wavelength extinction method (3-WEM) was applied for the on-line particle analysis of suspensions of monodisperse latex beads and polydisperse metal oxide particles of industrial interest. Comparative measurements were performed by photon correlation spectroscopy (PCS). The data of latex particles obtained by 3-WEM and PCS are in good agreement with the manufacturer's values. Also, the values of oxide particles measured by means of the two techniques are in reasonable agreement despite the irregular particle shape. Discrepancies are observed by comparing the oxide particle size results with those of scanning electron microscopy, which is due to the broad sample distributions and shape irregularities.
Alexander von Humboldt's maps, graphs and illustrations contain a great deal of detail, but in the available rare editions they are hardly visible to the naked eye. In many editions they have been reduced. In a digital library, they will become accessible in their entirety, and Internet technology will reproduce them in a form that overcomes the limitations of the original printing. The user will be able to enlarge the images and see details that might have been overlooked in the past. The Humboldt digital library will adhere to the standards for digital libraries established by the Open Archives Initiative (OAI) and will use the tools EPRINTS and DSPACE to provide the Web services and to determine the most effective way to establish dynamic linking and knowledge-based searching of information within the archive.
In this paper, a new method is demonstrated for online remote simulation of photovoltaic systems. The required communication technology for the data exchange is introduced and the methods of PV generator parameter extraction for the simulation models are analysed. The method shown for parameter extraction from the manufacturer data is especially useful for the commissioning procedure, where the measured installed power is transferred to standard test conditions using the simulation model and can then be easily compared with the design power. At a simulation accuracy of 2% using the software environment INSEL ® any problems with the PV generator can reliably be detected. Online simulation of a grid connected PV generator is then carried out during the operation of the photovoltaic plant. The visualisation includes both the monitored and the simulated online data sets, so that a very efficient fault detection scheme is available. The method is implemented and validated on several grid connected photovoltaic power plants in Germany. It is excellently suited to provide automatic and real time fault detection and significantly improve the commissioning procedure for photovoltaic plants of all sizes.
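The commissioning check described above rests on transferring a measured power to standard test conditions (STC: 1000 W/m², 25 °C). The sketch below shows a common simplified form of that transfer; it is an illustration only, not the INSEL model from the paper, and the temperature coefficient value is a typical assumption, not a figure from the source.

```python
def power_at_stc(p_meas: float, irradiance: float, t_cell: float,
                 gamma: float = -0.004) -> float:
    """Transfer a measured PV power to standard test conditions.

    p_meas     -- measured DC power [W] at the given conditions
    irradiance -- in-plane irradiance during measurement [W/m^2]
    t_cell     -- cell temperature during measurement [degC]
    gamma      -- power temperature coefficient [1/K] (typical value)
    """
    g_stc, t_stc = 1000.0, 25.0
    return p_meas * (g_stc / irradiance) / (1.0 + gamma * (t_cell - t_stc))
```

The result can then be compared directly with the design power, which is the essence of the commissioning procedure described above.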
This paper explores the potential of an m-learning environment by introducing the concept of mLab, a remote laboratory environment accessible through the use of handheld devices.
We are aiming to enhance the existing e-learning platform and internet-assisted laboratory settings, where students are offered in-depth tutoring, by providing compact tuition and tools for controlling simulations that are made available to learners via handheld devices. In this way, students are empowered by having access to their simulations from any place and at any time.
Sustainability aspects force a building manager to continuously observe actual states and developments concerning building use and energy and media flows. In the presented approach, a communication structure was built up to use different software applications and tools in order to optimize the operation of the building.
We propose a new streaming media service development environment comprising a streaming media service model, an XML-based service specification language and several implementation and configuration management tools. In our project, the described approach is used for the integration of streaming-based eLearning services in the manufacturing processes of a subcontractor to the automotive industry. The key components of our approach are (1) an XML-based streaming service specification language, (2) a set of Web services for searching, registration, and creation of streaming services, (3) caching and replication policies based on timing information derived from the service specifications.
The establishment of a software tool chain among requirements management tools, the black-box test tool CTE XL and RTRT is proposed in this paper. The use of the Classification Tree Method ensures a reduction in the number of test cases and promises increased efficiency when testing. The traceability of test cases and requirements is guaranteed by the established software tool chain with well-defined interfaces. As the experimental results point out, a better test coverage can be achieved. Future work can be based on the automatic generation of init and expected values for testing, requiring no interference from a software quality engineer. In conclusion, the tasks that need to be performed by the software quality engineer are to define the black-box test cases using CTM/CTE XL, to import the requirements from the requirements management tools, and to import the XML file into the test tool RTRT. Given the initial and expected values, the testing can be performed in a comfortable way.
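The core idea of the Classification Tree Method can be sketched as follows: the input domain is partitioned into classifications, each split into classes, and test cases are combinations of one class per classification. This is a generic illustration only; the classification names are hypothetical and not taken from CTE XL or the paper.

```python
from itertools import product

# Hypothetical classification tree: each key is a classification,
# each list its disjoint classes.
classifications = {
    "speed": ["low", "high"],
    "load":  ["empty", "full"],
    "mode":  ["auto", "manual"],
}

# Complete combination: one test case per tuple of classes.
# CTM tooling typically reduces this set (e.g. pairwise coverage).
test_cases = [dict(zip(classifications, combo))
              for combo in product(*classifications.values())]
```

With 2 × 2 × 2 classes the complete combination yields 8 test cases; the reduction promised above comes from selecting a covering subset instead of the full product.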
In thin-layer chromatography the development step distributes the sample throughout the layer, a process which strongly affects the reflection signals. The essential requirement for quantitative thin-layer chromatography is not a constant sample concentration but constant sample distribution in each sample spot. This makes evaporation of the mobile phase extremely important, because all tracks of a TLC plate must be dried uniformly. This paper shows that quantitative TLC is possible even if the concentration of the sample is not constant throughout the layer or if the distribution of the sample is not known. With uniform sample distribution, classical Kubelka-Munk theory is valid for isotropic scattering only. In the absence of this constraint classical Kubelka-Munk theory must be extended to situations where scattering is asymmetric. This can be achieved by modification of the original Kubelka-Munk equation. Extended theory is presented which is not only capable of describing asymmetrical scattering in TLC layers but also includes a formula for absorption and fluorescence in diode-array TLC. With this new theory all different formulas for diode-array thin-layer chromatographic evaluation are combined in one expression.
In 2000 the iSign project started as a virtual web-based laboratory for students of the electrical engineering study programme. Continuous development over the last years has led to a heterogeneous learning environment offering learning material, adaptive user settings and access to a simulation tool. Access is available via web and wireless devices such as PCs, laptops, PDAs, smartphones and mobile phones. Our attempt to adapt the content to the user's needs and the currently used device led us to an XML-based data structure. This report presents our research results on content adaptation based on XML data. The two main aspects of that process are the device capabilities and the adaptation methods using XML data.
Nowadays the processing power of mobile phones, smartphones and PDAs is increasing, as is the transmission bandwidth. Nevertheless there is still the need to reduce the content and the processing of the data. Proposals and solutions for dynamic reduction of the transmitted content will be discussed. For that, device-specific properties will be taken into account, aiming at reducing the processing power needed at the client side to display the 3D virtual reality data. Therefore, well-known technologies like data compression are combined with new approaches to achieve the goal of adaptive content transmission. For a device-dependent reduction of processing power the data has to be pre-processed at the server side or the server itself has to take over functionality of weak mobile devices.
In thin-layer chromatography, fiber-bundle arrays have been introduced for spectral absorption measurements in the UV region. Using all-silica fiber bundles, the exciting light is detected after re-emission on the plate with a fiber-optic spectrometer. In addition, fluorescence light can be detected, which is masked by the re-emitted light. Therefore, it is helpful to separate absorption and fluorescence on the TLC plate. A modified three-array assembly has been developed: one array is used for detection, the two others for excitation with broadband deuterium light and with UV-LEDs adjusted to the substances under test. As an example, the quantification of glucosamine in nutritional supplements or spinach leaf extract is described. Using simple heating of the amino plate for derivatization, the reaction product of glucosamine can be detected sensitively either by light absorption or by fluorescence, using the new fiber-optic assembly. In addition, the properties of the new three-row fiber-optic array and the commercially available UV-LEDs are shown in the wavelength region of interest for excitation of fluorescence, from 260 nm to 360 nm. The squint angle, which influences coupling efficiency and spatial resolution, is measured with the inverse far-field method. Some properties of UV-LEDs for analytical applications are also described and discussed.
Previous studies of the hyphenation of gas chromatographic separation and spectrophotometric detection in the ultraviolet wavelength range between 168 and 330 nm showed a high potential for applications where the analysis of complex samples is required. Within this paper the development of a state-of-the-art detection system for compounds in the vapour phase is described, offering improved behaviour compared to previous systems. Based on the requirements of established detection systems hyphenated with gas chromatography, the main components have to be designed for optimum performance and reliability of the spectrophotometric detector. A deuterium lamp was selected as a broadband light source for improved measurement stability. A new type of absorption cell based on fiber optics has been developed, considering the dynamic range necessary to compete with existing techniques. In addition, the influence of the volume of the cell on the chromatogram needs to be analyzed. Tests for determining the performance of the absorption cell in terms of chemical and thermal influences have been carried out. A new spectrophotometer with adequate spectral resolution in this wavelength range, offering improved stability and dynamic range for efficient use in this application, was developed. Furthermore, the influence of each component on the performance, reliability and stability of the sensor system is discussed. An overview of and outlook on potential applications in the environmental, scientific and medical fields is given.
Non-esterified plant oils are gaining ecological and economic importance, particularly in the EU, where it is intended to increase the share of renewable energies. Plant oils do not require any chemical treatment and thus do not cause secondary pollution. The importance of plant oil will increase in Germany for mobile and stationary applications. The co-generation of heat and power is subsidized by the German "Erneuerbares Energiegesetz" and the "Kraft-Wärme-Kopplungsgesetz" when renewable fuels such as plant oils are used.
Plant oils have a much higher viscosity than conventional gas oil. It is mandatory to decrease the oil viscosity by heating prior to injection to ensure proper injection and to avoid engine damage due to coke formation in the combustion chamber and at the injection nozzle. The German quality standard of Weihenstephan (RK-Qualitätsstandard 05/2000) for rape seed oil should be followed for use as diesel fuel. The chemical composition of plant oils is appreciably different from that of diesel fuels derived from mineral oils, suggesting different emission behavior as well.
Particle and Gaseous Emissions of Diesel Engines Fuelled by Different Non-Esterified Plant Oils
(2007)
The particulate matter and gas emissions of several plant oils are analyzed in the hot exhaust gas under various engine conditions at different speeds and loads. The measurement data are compared to the emission values of conventional diesel fuel (gas oil). The investigation concentrates on a modern common-rail TDI light-duty diesel engine, four cylinders, for passenger cars. The differences in the gas and particulate matter emission - compared to conventional diesel fuel - are remarkably low for a diesel engine which is properly adjusted for the plant oils. Emission data of an old heavy-duty diesel engine are also shown for comparison and reveal large differences. Differences are found in the pressures of the indicator diagram, time-resolved over the crank angle. Plant oils consistently exhibit a higher cylinder pressure. The TEM investigation confirms the differences found by the LPME (long path multi-wavelength extinction) on-line analysis.
Electronic pills, smart capsules or miniaturized microsystems swallowed by human beings or animals for various biomedical and diagnostic applications have grown rapidly in recent years. This paper surveys the important electronic pills existing in the market and prototypes in research centers. A further objective of this research is to develop a technology platform with enhanced features to cover the drawbacks of most capsules. The designed telemetry unit is a synchronous bidirectional communication block using continuous-phase DQPSK at a low carrier frequency of 115 kHz for inductive data transmission suited for human body energy transfer. The communication system can assist the electronic pill to trigger an actuator for drug delivery, to record temperature, or to measure the pH of the body. It additionally comprises a 32-bit processor, memory, external peripheries, and a detection facility. The complete system is designed to fit small-size mass medical applications with low power consumption and a size of 7 × 25 mm. The system is designed, simulated and emulated on FPGA.
A platform of an electronic capsule is being developed for multi-task medical assistant applications. It includes a near-field telemetry unit for a bidirectional communication system at a low carrier frequency of 115 kHz for inductive data transmission suited for human body energy transfer. The system triggers an actuator for drug delivery in various time and release forms via wireless external control; it has the ability to record temperature, measure the pH of the body (with additional sensors), and retrieve data to the outside. It consists of a 32-bit processor, memory, external peripheries, and a detection facility. The complete system is designed to fit small-size mass medical applications with low power consumption and a size of 7 × 25 mm. The system is designed, simulated and emulated on FPGA. A final layout of the complete chip design is still in progress.
A new electronic capsule with a bidirectional communication system is being developed for multi-task applications. The capsule is designed to be a platform for medical assistant applications inside the body. The designed telemetry unit is a synchronous bidirectional communication block using continuous-phase DQPSK at a low carrier frequency of 115 kHz for inductive data transmission suited for human body energy transfer. The communication system can assist the electronic pill to trigger an actuator for drug delivery, to record temperature, or to measure the pH of the body. It additionally comprises a 32-bit processor, memory, external peripheries, and a detection facility. The complete system is designed to fit small-size mass medical applications with low power consumption and a size of 7 × 25 mm. The system is designed, simulated and emulated on FPGA. A final layout of the complete chip design is still in progress.
A new small processor core named SIRIUS, optimized for low power, has been developed, simulated, synthesized to a netlist and verified. From this netlist, containing only primitives like gates and flip-flops, a mapping to an ASIC or FPGA technology can easily be done with existing synthesizer tools, allowing very complex SoC designs with several blocks. Emulation via FPGA can be done on simple setups and inexpensive hardware because of the small core size. The performance is estimated at 50 MIPS on a Cyclone II FPGA and about 100 MIPS on a 0.35 µm CMOS 5M2P technology, with 4197 primitives used for the core, including a 16 × 16 multiplier. An example design of the ASIC for an electronic ePille device currently in development is shown.
This paper focuses on the effects of differential mode delay (DMD) on the bandwidth of multimode optical fibres. First an analytical solution for the computation of the differential mode time delay is presented. The electrical field of each mode is calculated by the numerical solution of the Helmholtz equation. Based on this solution the modal power distribution as well as the fibre's impulse response under different launching conditions can be obtained.
Next, the refractive-index profile of two fibres is modelled on the basis of DMD measurements. It is shown that these measurements provide enough information to predict the fibre's propagation characteristics under different launch conditions (excitation conditions).
To provide proper solutions to the problem of device-dependent content delivery, a fine categorization of the application target devices is needed. Earlier attempts provided two different presentations for desktop and mobile platforms. The mobile platform presentation was divided into three categories, based on a general classification (PDA, smartphone or mobile phone). In order to improve the presentation on mobile devices, a finer categorization is introduced. In this paper, our focus is to clarify the concept of this more flexible presentation module, in which the delivered content depends on the efficiency of the device based on a selected set of capabilities.
In-situ densitometry for qualitative or quantitative purposes is a key step in thin-layer chromatography (TLC). It is a simple means of quantification by measurement of the optical density of the separated spots directly on the plate. A new scanner has been developed which is capable of measuring TLC or HPTLC (high-performance thin-layer chromatography) plates simultaneously at different wavelengths without damaging the plate surface. Fiber optics and special fiber interfaces are used in combination with a diode-array detector. With this new scanner sophisticated plate evaluation is now possible, which enables the use of chemometric methods in HPTLC. Different regression models have been introduced which enable appropriate evaluation of all analytical questions. Fluorescence measurements are possible without filters or special lamps, and signal-to-noise ratios can be improved by wavelength bundling. Because of the richly structured spectra obtained from PAH, diode-array HPTLC enables quantification of all 16 EPA PAH on one track. Although the separation is incomplete, all 16 compounds can be quantified by use of suitable wavelengths. All these aspects enable substantial improvement of in-situ quantitative densitometric analysis.
Design of next-generation CDMA using orthogonal complementary codes and offset stacked spreading
(2007)
This article presents an innovative code-division multiple access system architecture that is based on orthogonal complementary spreading codes and time-frequency domain spreading. The architecture has several advantages compared to conventional CDMA systems. Specifically, it offers multiple-access-interference-free operation in AWGN channels, reduces co-channel interference significantly, and has the potential for higher capacity and spectral efficiency than conventional CDMA systems. This is accomplished by using an "offset stacked" spreading modulation technique followed by quadrature amplitude modulation, which optimizes performance in a fading environment. This new spreading modulation scheme also simplifies the rate matching algorithms relevant for multimedia services and IP-based applications.
The identification and quantification of compounds in the gas phase is of increasing interest in the context of environmental protection, as well as in the analytical field. In this respect, the high extinction coefficients of vapours and gases in the ultraviolet wavelength region allow a very sensitive measurement system. In addition, the performance of the components necessary for setting up a measurement system, such as fibres, light sources and detectors, has been improved. In particular, the light sources and detectors offer improved stability, and the deep-UV performance and solarisation resistance of fused-silica fibres have been significantly optimized in the past years. Therefore a compact and reliable detection system with high measuring accuracy is being developed. Within this paper, possible applications of the system under development and recent results are discussed.
Soot particles emitted from a light-duty (LD) Volkswagen diesel engine running at different operating points (speed and torque levels) are analyzed for mean size determination using a laser-based three-wavelength extinction method (3-WEM). For this purpose, collected soot samples are suspended using an appropriate sample preparation technique with optimized sonication conditions, since sonication was found to affect the soot mean particle size measured by 3-WEM.
An online Scanning Mobility Particle Sizer (SMPS) is also used to measure soot emission at identical engine operating points. Size values obtained from SMPS are lower than those of suspended soot samples obtained from 3-WEM. The size discrepancies are mainly related to the required sample preparation procedure employed for 3-WEM measurements. The engine operating points affect the size measurements obtained from SMPS and 3-WEM differently.
Sedimentation Field‐Flow Fractionation (SdFFF) is used for density determination of soot samples based on size measurements of fractions collected at peak maxima of fractograms using the off‐line hyphenation with 3‐WEM. It is assumed that a size dependent separation of soot particles occurred with a uniform particle density over the whole size distribution. An average density value is used for the conversion of soot fractograms to size distributions. Discrepancies are also found with size distribution profiles obtained from SMPS for the same engine operating points, due to the sample preparation procedure employed for SdFFF measurements.
This paper presents a multicarrier code-division multiple-access (CDMA) system architecture that is based on complete complementary orthogonal spreading codes. The architecture has several advantages as compared to conventional CDMA systems. Specifically, it offers multiple-access interference-free operation in additive white Gaussian noise channels, reduces cochannel interference significantly, and has the potential of higher capacity and spectral efficiency than conventional CDMA systems. This is accomplished by using an "offset stacked" spreading modulation technique. To maintain good performance in the presence of fading, the offset stacked modulator is followed by a quadrature-amplitude modulation map, which is designed to optimize performance in a fading environment. This new modulation scheme also simplifies the rate-matching algorithms that are relevant for multimedia services and Internet Protocol-based applications.
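The key property of the complementary codes underlying this architecture can be checked numerically: for a complementary pair, the aperiodic autocorrelations of the two sequences sum to zero at every nonzero shift, which is what yields interference-free operation in the ideal channel. The sketch below verifies this for the simplest Golay complementary pair; it illustrates the code property only, not the paper's full multicarrier system.

```python
def acorr(seq: list[int], shift: int) -> int:
    """Aperiodic autocorrelation of seq at the given shift."""
    return sum(seq[i] * seq[i + shift] for i in range(len(seq) - shift))

# The simplest Golay complementary pair.
a, b = [1, 1], [1, -1]

# Summed autocorrelations: total energy at shift 0, zero elsewhere.
sums = [acorr(a, s) + acorr(b, s) for s in range(len(a))]
```

Longer complementary sets used in complete-complementary CDMA satisfy the same zero-sidelobe property, extended to cross-correlations between users.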
In this paper a practical way for fatigue life prediction of rubber products under multiaxial loads is shown. This is done by means of fracture-mechanical concepts with the energy release rate as the failure criterion. Using an FEA post-processor, the potential energy release rate can be calculated at every material point, assuming a crack were present there. Together with a strain number curve, the risk of failure and the time to fatigue can thus be calculated by FEA. This concept is applied for an estimation of the lifetime of a test specimen under tensile loading from fatigue data of a shear-loaded specimen of different design. This rather theoretical concept of the energy release rate is complemented by experimental crack growth data from a Tear Fatigue Analyzer, whose great advantage is a reduction of testing time and costs compared to those of fatigue tests. For some materials a thorough characterization of crack growth and fatigue behavior is presented and applied to estimate the time to fatigue by FEA for a real component under multiaxial loads.
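Crack-growth data of the kind obtained from a Tear Fatigue Analyzer are commonly fitted with a power law in the tearing energy (energy release rate), da/dN = B·T^β. The sketch below shows only that generic relation; B and β are material constants fitted from test data, and the numerical values here are purely illustrative, not from the paper.

```python
def crack_growth_rate(tearing_energy: float,
                      B: float = 1e-9, beta: float = 2.0) -> float:
    """Power-law crack growth per cycle, da/dN = B * T**beta.

    tearing_energy -- energy release rate T [kJ/m^2]
    B, beta        -- fitted material constants (illustrative values)
    """
    return B * tearing_energy ** beta
```

Integrating the reciprocal of this rate over crack length then yields the fatigue-life estimate that the FEA post-processing approach above relies on.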
Experimental and numerical investigations into the forming of tailored strips and tailored tubes
(2008)
Through the application of tailored strips and tailored tubes, the wall thickness of components can be manufactured in a load-optimised manner, which also makes it possible to optimise component weight. Prior to the application of tailored products, the wall thicknesses, the respective degrees of deformation, and the weld seam position can be determined in an FEM (finite element method) simulation. These results are then verified in test series on transfer presses and tube bending machines, with the necessary tool adaptations being determined in the process. This results in weight and cost reductions for deep-drawn components and tube sections. Moreover, especially with regard to tubes, multiple sections can be combined in one component. A feasibility study shows that the level of possible weight and cost savings depends on the respective component geometry and load situation. Additional costs for the production of tailored products and - if necessary - tool modifications also need to be considered. Thus, the savings achievable for a part can only be determined on an individual basis.
The authors present an abiotically catalyzed glucose fuel cell and demonstrate its application as an energy-harvesting power source for a cardiac pacemaker. This is enabled by an optimized DC-DC converter operating at 40% conversion efficiency, which surpasses commercial low-power DC-DC converters. The required fuel cell surface area can thus be reduced from about 125 cm² to 18 cm², which would allow its direct integration onto the pacemaker casing.
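The area reduction follows from a simple scaling: the fuel cell must supply the pacemaker load divided by the converter efficiency, so the required area goes as 1/efficiency. A back-of-the-envelope sketch with assumed load and areal power density values (both hypothetical, chosen only to illustrate the scaling, not taken from the paper):

```python
# Back-of-the-envelope sketch (assumed numbers, not the paper's):
# fuel cell area scales inversely with converter efficiency,
#   A = P_load / (eta * p_areal).

def required_area_cm2(p_load_uW, eta, p_areal_uW_per_cm2):
    """Fuel cell surface area needed to supply p_load through a converter."""
    return p_load_uW / (eta * p_areal_uW_per_cm2)

P_LOAD = 10.0    # uW, hypothetical pacemaker consumption
P_AREAL = 1.4    # uW/cm^2, hypothetical glucose fuel cell output density

a_good = required_area_cm2(P_LOAD, 0.40, P_AREAL)    # efficient converter
a_poor = required_area_cm2(P_LOAD, 0.057, P_AREAL)   # inefficient converter
```

Under these assumed numbers the efficient converter needs roughly 18 cm² while the inefficient one needs well over 100 cm², matching the order of magnitude of the reduction reported in the abstract.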
A new miniaturized capsule with a 32-bit processor and a bidirectional communication system is being developed for multitask applications. The capsule is designed as a platform for medical assistant applications inside the body. The processor core SIRIUS has been developed, simulated, synthesized to a netlist, and verified. The designed telemetry unit is a synchronous bidirectional communication block using continuous-phase DQPSK at a low carrier frequency of 115 kHz for inductive data transmission, suited to energy transfer through the human body. The communication system enables the electronic pill to trigger an actuator for drug delivery, to record temperature, or to measure the pH of the body. The complete system is designed for small-size, low-power mass medical applications, with a size of 7 × 25 mm. The system has been designed, simulated, emulated on an FPGA, and routed in AMIS technology.
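A minimal sketch of the differential encoding idea behind DQPSK, the modulation family the telemetry unit uses: each dibit selects a phase change rather than an absolute phase, so the receiver only compares consecutive symbols. The Gray-coded dibit-to-phase mapping below is one common convention assumed for illustration, not the capsule's exact continuous-phase implementation.

```python
# Generic DQPSK encoder sketch (assumed Gray-coded mapping; the paper's
# continuous-phase variant additionally smooths the phase transitions).
import cmath

# Dibit -> phase increment (one common convention, assumed):
PHASE_STEP = {(0, 0): 0.0,
              (0, 1): cmath.pi / 2,
              (1, 1): cmath.pi,
              (1, 0): 3 * cmath.pi / 2}

def dqpsk_encode(bits):
    """Map a bit list (even length) to complex baseband symbols."""
    phase = 0.0
    symbols = []
    for i in range(0, len(bits), 2):
        phase += PHASE_STEP[(bits[i], bits[i + 1])]   # accumulate phase
        symbols.append(cmath.exp(1j * phase))
    return symbols

syms = dqpsk_encode([0, 0, 0, 1, 1, 1])
# phases accumulate: 0, pi/2, 3*pi/2  ->  symbols 1, j, -j
```

Because information sits in the phase differences, the demodulator needs no absolute carrier-phase recovery, which is convenient for a low-frequency inductive link.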
As part of a master's thesis, an existing system-on-chip design that processes incoming ECG data signals was extended so that it can be fully controlled and read out via the standardized SPI bus.
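A hedged sketch of what register access over SPI might look like for such a design. The one-byte address-plus-R/W framing, the READ flag, and the register address used below are entirely hypothetical illustrations of a common SPI register convention, not taken from the thesis.

```python
# Hypothetical SPI register framing (assumed convention, not the thesis's):
# byte 0 = R/W flag in the MSB plus a 7-bit register address,
# byte 1 = data (write) or a dummy byte that clocks out the reply (read).

READ = 0x80  # assumed: MSB set marks a read transaction

def build_write_frame(addr, value):
    """Two-byte SPI frame writing `value` to register `addr`."""
    assert 0 <= addr < 0x80 and 0 <= value <= 0xFF
    return bytes([addr & 0x7F, value])

def build_read_frame(addr):
    """Two-byte SPI frame; the dummy second byte clocks out the reply."""
    assert 0 <= addr < 0x80
    return bytes([READ | (addr & 0x7F), 0x00])

frame = build_write_frame(0x12, 0xAB)   # -> b'\x12\xab'
```

Exposing every control and status register through such a frame format is what makes a design "fully controllable and readable" over SPI from any host microcontroller.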