In the field of neuroprosthetics, the current state-of-the-art approach controls the prosthesis with electromyography (EMG) or electrooculography/electroencephalography (EOG/EEG). However, these systems are expensive, time-consuming to calibrate, susceptible to interference, and require a lengthy learning phase on the patient's part. It therefore remains an open challenge to design more robust systems that are suitable for everyday use and meet the needs of patients. In this paper, we present, for the first time in a proof-of-concept study, a new concept for completely visual control of a prosthesis, an exoskeleton, or another end effector using augmented reality (AR) glasses. A marker attached to the prosthesis is tracked with the monocular camera of the AR glasses. Minimal relative movements of the head with respect to the prosthesis are registered by the tracking and used for control. Two possible control mechanisms, each including visual feedback, are presented and implemented for both a motorized hand orthosis and a motorized hand prosthesis. Since the grasping process is mainly controlled by vision, the proposed approach appears natural and intuitive.
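As a rough illustration of the tracking front end described above, the following sketch detects a fiducial marker in frames from a head-mounted camera and reports its image position. It assumes an ArUco marker and the pre-4.7 cv2.aruco API of opencv-contrib-python; neither the marker type nor the camera is specified by the paper.

```python
# Hedged sketch: detect a fiducial on the prosthesis from a head-mounted camera
# and report its image position. Assumes an ArUco marker and the pre-4.7
# cv2.aruco API of opencv-contrib-python; the paper's marker is not specified.
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def track_marker(frame):
    """Return the pixel centre of the first detected marker, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is None:
        return None
    return corners[0][0].mean(axis=0)  # centroid (x, y) of the four marker corners

cap = cv2.VideoCapture(0)  # head-mounted camera, assumed to be device 0
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    centre = track_marker(frame)
    if centre is not None:
        # Relative head movement appears as motion of `centre` in the image and can
        # be fed to a command-window decoder (see the sketch further below).
        print(centre)
```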
A new concept for robust, non-invasive optical activation of motorized hand prostheses by simple, non-contact commands is presented. In addition, a novel approach for aiding hand amputees is shown, outlining significant conceptual progress that is worth testing. In this approach, personalized 3D-printed flexible artificial hands are combined with commercially available motorized exoskeletons, as are used, for example, by tetraplegic patients.
Purpose
This work presents a new monocular peer-to-peer tracking concept overcoming the distinction between tracking tools and tracked tools for optical navigation systems. A marker model concept based on marker triplets combined with a fast and robust algorithm for assigning image feature points to the corresponding markers of the tracker is introduced. Also included is a new and fast algorithm for pose estimation.
Methods
A peer-to-peer tracker consists of seven markers, which can be tracked by other peers, and one camera, which is used to track the position and orientation of other peers. The special marker layout enables a fast and robust algorithm for assigning image feature points to the correct markers. The iterative pose estimation algorithm is based on point-to-line matching with Lagrange–Newton optimization and does not rely on initial guesses. Uniformly distributed quaternions in 4D (the vertices of a hexacosichoron) are used as starting points and always provide the global minimum; a minimal multi-start sketch follows this abstract.
Results
Experiments have shown that the marker assignment algorithm robustly assigns image feature points to the correct markers even under challenging conditions. The pose estimation algorithm is fast and robust and always finds the correct pose of the trackers. Image processing, marker assignment, and pose estimation for two trackers are handled in less than 18 ms on an Intel i7-6700 desktop computer at 3.4 GHz.
Conclusion
The new peer-to-peer tracking concept is a valuable approach to a decentralized navigation system that offers more freedom in the operating room while providing accurate, fast, and robust results.
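The multi-start idea from the Methods section can be illustrated with a small sketch: minimize the point-to-line distances over rotation and translation from many quaternion starting points and keep the best solution. SciPy's general least-squares solver and random unit quaternions stand in here for the paper's Lagrange–Newton solver and the hexacosichoron vertices; this is an illustrative approximation, not the published algorithm.

```python
# Hedged sketch: multi-start pose estimation by point-to-line matching.
# scipy's least-squares solver and random unit quaternions stand in for the
# paper's Lagrange-Newton solver and the hexacosichoron vertices.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation as R

def point_to_line_residuals(params, model_pts, rays):
    """Perpendicular distance of each transformed marker point to its viewing ray."""
    q, t = params[:4], params[4:]
    q = q / np.linalg.norm(q)                        # keep the quaternion on the unit sphere
    pts_cam = R.from_quat(q).apply(model_pts) + t    # marker points in camera coordinates
    proj = np.sum(pts_cam * rays, axis=1, keepdims=True) * rays  # projection onto unit rays
    return (pts_cam - proj).ravel()

def estimate_pose(model_pts, rays, n_starts=120, seed=0):
    """Run the local optimization from many quaternion starts, keep the best minimum."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_starts):
        q0 = rng.normal(size=4)
        q0 /= np.linalg.norm(q0)                     # uniformly distributed random rotation
        x0 = np.concatenate([q0, [0.0, 0.0, 0.5]])   # rough initial translation guess
        res = least_squares(point_to_line_residuals, x0, args=(model_pts, rays))
        if best is None or res.cost < best.cost:
            best = res
    q = best.x[:4] / np.linalg.norm(best.x[:4])
    return R.from_quat(q).as_matrix(), best.x[4:]    # rotation matrix and translation
```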
The invention relates to a method for controlling a device, in particular a prosthetic hand or a robotic arm, in which at least one marker positioned on or in relation to the device is detected by a camera mounted on an operator. From the moment the at least one marker is detected, a predefined movement of the operator together with the camera is detected and used to trigger a corresponding action of the device, the predefined movement of the operator being detected in the form of a line of sight by means of camera tracking. The invention further relates to an arrangement consisting of a device, in particular a prosthetic hand or a robotic arm, and a pair of AR glasses for carrying out such a method.
Restoring hand motion to people experiencing amputation, paralysis, and stroke is a critical area of research and development. While electrode-based systems that use input from the brain or muscle have proven successful, these systems tend to be expensive and difficult to learn. One group of researchers is exploring the use of augmented reality (AR) as a new way of controlling hand prostheses. A camera mounted on eyeglasses tracks LEDs on a prosthetic to execute opening and closing commands using one of two different AR systems. One system uses a rectangular command window to control motion: crossing horizontally signals “open” along one direction and “close” in the opposite direction. The second system uses a circular command window: once control is enabled, gripping strength can be controlled by the direction of head motion. While the visual system remains to be tested with patients, its low cost, ease of use, and lack of electrodes make the device a promising solution for restoring hand motion.
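A minimal sketch of the rectangular command window described above follows, assuming a simple pixel-coordinate window and the command names 'open' and 'close'; the window geometry, the entry/exit logic, and the command names are illustrative assumptions, not taken from the publication.

```python
# Hedged sketch of a rectangular command window: a full horizontal crossing of
# the window by the tracked marker triggers a command. Geometry and command
# names are illustrative assumptions.
class CommandWindow:
    def __init__(self, x_left, x_right, y_top, y_bottom):
        self.x_left, self.x_right = x_left, x_right
        self.y_top, self.y_bottom = y_top, y_bottom
        self.entry_side = None                      # side through which the marker entered

    def update(self, x, y):
        """Feed one tracked marker position (pixels); return 'open', 'close' or None."""
        inside = self.x_left <= x <= self.x_right and self.y_top <= y <= self.y_bottom
        if inside:
            if self.entry_side is None:             # marker just entered the window
                mid = 0.5 * (self.x_left + self.x_right)
                self.entry_side = 'left' if x < mid else 'right'
            return None
        if self.entry_side == 'left' and x > self.x_right:
            self.entry_side = None
            return 'open'                           # crossed left-to-right
        if self.entry_side == 'right' and x < self.x_left:
            self.entry_side = None
            return 'close'                          # crossed right-to-left
        self.entry_side = None                      # left the window without crossing it
        return None
```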
The invention relates to a method for calibrating a camera (110) using a screen (120), the screen (120) having a set of image points (122) and the camera (110) using a plurality of pixels (112) to represent the image. The method comprises the following steps: (a) displaying at least one image value (BW) at at least one image point (122) of the screen (120) based on an image-value assignment; (b) capturing the at least one image value (BW) with a pixel (112a) of the camera (110); and (c) determining the position of the at least one image point (122) on the screen (120) based on the at least one captured image value (BW) and the image-value assignment.
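One possible image-value assignment of the kind the patent describes is a sequence of bit-plane patterns that encode each screen coordinate, so that a camera pixel observing the whole sequence can decode which screen point it sees. The sketch below assumes plain binary bit-planes and ignores thresholding and noise; the patented assignment itself is not reproduced here.

```python
# Hedged sketch of a screen-based coding scheme: one bit-plane pattern per bit of
# the x and y screen coordinates. Plain binary bit-planes are an illustrative
# assumption, not the patented image-value assignment.
import numpy as np

def encode_patterns(width, height):
    """One full-screen bit-plane image per bit of the x and y screen coordinates."""
    xs, ys = np.arange(width), np.arange(height)
    x_bits = max(1, int(np.ceil(np.log2(width))))
    y_bits = max(1, int(np.ceil(np.log2(height))))
    x_patterns = [np.broadcast_to((xs >> b) & 1, (height, width)) for b in range(x_bits)]
    y_patterns = [np.broadcast_to(((ys >> b) & 1)[:, None], (height, width)) for b in range(y_bits)]
    return x_patterns, y_patterns

def decode(observed_x_bits, observed_y_bits):
    """Recover the screen coordinate seen by one camera pixel from its observed bit sequence."""
    x = sum(bit << b for b, bit in enumerate(observed_x_bits))
    y = sum(bit << b for b, bit in enumerate(observed_y_bits))
    return x, y
```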
Method for controlling a device, in particular, a prosthetic hand or a robotic arm (US20200327705A1)
(2020)
A method for controlling a device, in particular a prosthetic hand or a robotic arm, includes using an operator-mounted camera to detect at least one marker positioned on or in relation to the device. Starting from the detection of the at least one marker, a predefined movement of the operator together with the camera is detected and is used to trigger a corresponding action of the device. The predefined movement of the operator is detected in the form of a line of sight by means of camera tracking. A system for controlling a device, in particular a prosthetic hand or a robotic arm, includes a pair of AR glasses adapted to detect the at least one marker and to detect the predefined movement of the operator.
The system presented here combines the new concept of peer-to-peer navigation with the use of augmented reality to support external ventricular drainage performed at the bedside. The very compact and accurate overall system comprises a patient tracker with an integrated camera, augmented reality glasses with a camera, and a puncture needle or pointer with two trackers, which is used to register the patient's anatomy. The exact position and direction of the puncture needle are computed with the aid of the recorded landmarks and displayed on the patient, visible to the surgeon through the augmented reality glasses. The methods for calibrating the static transformations between the patient tracker and its attached camera, and between the trackers of the puncture needle, are crucial for accuracy and are presented here. The overall system was successfully tested in vitro, confirming the usefulness of a peer-to-peer navigation system.
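A small sketch of the transform chain such a peer-to-peer setup implies: runtime tracker poses and statically calibrated offsets are composed as homogeneous transforms to express the needle tip in the AR-glasses frame. Frame names and numeric values are illustrative assumptions; the calibration procedure itself is not reproduced.

```python
# Hedged sketch of the transform chain in a peer-to-peer navigation setup.
# Frame names and the placeholder values are illustrative assumptions.
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# Measured at runtime by the respective cameras (assumed available from tracking):
T_glasses_patientTracker = make_T(np.eye(3), [0.0, 0.0, 0.6])    # glasses camera -> patient tracker
T_patientCam_needleTracker = make_T(np.eye(3), [0.1, 0.0, 0.4])  # patient-tracker camera -> needle tracker

# Static transforms determined once by calibration (placeholder values here):
T_patientTracker_patientCam = make_T(np.eye(3), [0.0, 0.02, 0.0])
T_needleTracker_tip = make_T(np.eye(3), [0.0, 0.0, 0.15])

# Needle tip expressed in the AR-glasses frame: chain all transforms.
T_glasses_tip = (T_glasses_patientTracker
                 @ T_patientTracker_patientCam
                 @ T_patientCam_needleTracker
                 @ T_needleTracker_tip)
tip_position = T_glasses_tip[:3, 3]
```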