Differentiation between human and non-human objects can increase the efficiency of human-robot collaborative applications. This paper proposes using convolutional neural networks to classify objects in robotic applications. The body temperature of human beings is used to classify humans and to estimate their distance to the sensor. With image classification based on convolutional neural networks, it is possible to detect humans in the surroundings of a robot at distances of up to five meters using low-cost, low-weight thermal cameras. Using transfer learning, we trained GoogLeNet and MobileNetV2; the results show accuracies of 99.48 % and 99.06 %, respectively.
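The core idea of the abstract is that body temperature separates humans from other objects in a thermal image. As a minimal illustration of that idea only (not the authors' CNN pipeline; the temperature band, image format, and threshold are assumptions), a frame can be pre-screened for human-like temperatures:

```python
# Hypothetical sketch: flag thermal-image pixels in an assumed human
# skin-temperature band and decide whether a human candidate is present.
# This is NOT the CNN classifier from the paper, only a pre-screening step.

HUMAN_TEMP_MIN_C = 28.0  # assumed lower bound for skin temperature at a distance
HUMAN_TEMP_MAX_C = 38.0  # assumed upper bound

def human_candidate(thermal_image, min_pixels=10):
    """Return True if enough pixels fall inside the human temperature band.

    thermal_image: 2D list of per-pixel temperatures in deg C (assumed format).
    min_pixels: minimum number of in-band pixels to count as a candidate.
    """
    hits = sum(
        1
        for row in thermal_image
        for t in row
        if HUMAN_TEMP_MIN_C <= t <= HUMAN_TEMP_MAX_C
    )
    return hits >= min_pixels

# Example: a small frame with a warm 4x4 blob against a cooler background.
frame = [[22.0] * 8 for _ in range(8)]
for r in range(2, 6):
    for c in range(2, 6):
        frame[r][c] = 33.5
print(human_candidate(frame))  # warm blob present -> True
```

In a real pipeline such a thresholding step would at most gate the CNN; the paper's classifiers operate on the full thermal image.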
The development of a 3D printed force sensor for a gripper was studied, using an embedded constantan wire as the sensing element. The first section explains the state of the art. The main section describes the modeling, simulation, and verification of a sensor element for a three-point bending test conducted in accordance with DIN EN ISO 178. The Fused Filament Fabrication (FFF) 3D printing process used to manufacture the sensor samples, in combination with an industrial robot, is presented. A comparison between theory and practice is considered in detail. Finally, an outlook is given on the integration of the sensor element into gripper jaws.
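The three-point bending test mentioned follows DIN EN ISO 178, whose flexural stress is given by sigma_f = 3·F·L / (2·b·h²). As a small worked sketch (the specimen dimensions and load below are illustrative assumptions, not values from the paper):

```python
def flexural_stress(force_n, span_mm, width_mm, thickness_mm):
    """Flexural stress (MPa) in a three-point bending test,
    sigma_f = 3*F*L / (2*b*h^2), as defined in DIN EN ISO 178."""
    return 3.0 * force_n * span_mm / (2.0 * width_mm * thickness_mm ** 2)

# ISO 178 standard specimen (80 x 10 x 4 mm, 64 mm span) under a
# hypothetical 50 N load:
sigma = flexural_stress(50.0, 64.0, 10.0, 4.0)
print(round(sigma, 1))  # 30.0 MPa
```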
This paper presents the development of a capacitive level sensor for robotics applications, designed to measure liquid levels during a pouring process. The proposed sensor design combines guard electrodes with passive shielding to increase its resistance to external influences. This is important for reliable operation in the rapidly changing measurement environments that occur in the field of robotics. Measuring the liquid level without contact avoids contamination and complies with food guidelines, so the sensor can be used in gastronomic applications. Two versions of the sensor were simulated, fabricated, and compared: the first is based on copper electrodes, while the other is fully 3D printed with electrodes made of conductive polylactic acid (PLA).
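A common first-order model for capacitive level sensing is that the measured capacitance grows linearly between the empty and full readings. The following sketch inverts that model to recover the level; the linearity assumption and all calibration values are illustrative, not taken from the paper:

```python
def level_from_capacitance(c_measured_pf, c_empty_pf, c_full_pf, height_mm):
    """Estimate the liquid level (mm) assuming capacitance grows linearly
    between the empty and full calibration readings -- a first-order model;
    real sensors are calibrated against reference fills."""
    fraction = (c_measured_pf - c_empty_pf) / (c_full_pf - c_empty_pf)
    # Clamp to the physical range before scaling to the measuring height.
    return max(0.0, min(1.0, fraction)) * height_mm

# Hypothetical calibration for a 100 mm measuring range:
# 10 pF empty, 30 pF full, 15 pF measured -> 25 mm.
print(level_from_capacitance(15.0, 10.0, 30.0, 100.0))
```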
3D printing offers customisation capabilities for the suspensions of oscillators in vibration energy harvesters. Adjusting printing parameters or geometry makes it possible to influence dynamic properties such as the resonance frequency or bandwidth of the oscillator. This paper presents simulation results and measurements for a spiral-shaped suspension printed from polylactic acid (PLA) at different layer heights. Eigenfrequencies were simulated and measured, and damping ratios were determined experimentally.
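The two quantities the abstract names can be sketched with textbook formulas: the natural frequency of a spring-mass oscillator, and the damping ratio obtained from the logarithmic decrement of a free-decay measurement. The numeric values below are illustrative assumptions, not data from the paper:

```python
import math

def natural_frequency_hz(stiffness_n_per_m, mass_kg):
    """Undamped natural frequency f = sqrt(k/m) / (2*pi) of a spring-mass
    oscillator -- the quantity the printed suspension geometry tunes."""
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2.0 * math.pi)

def damping_ratio_from_decay(x0, xn, n_cycles):
    """Damping ratio from the logarithmic decrement of a free decay:
    delta = ln(x0/xn) / n, zeta = delta / sqrt(4*pi^2 + delta^2)."""
    delta = math.log(x0 / xn) / n_cycles
    return delta / math.sqrt(4.0 * math.pi ** 2 + delta ** 2)

# Hypothetical suspension: k = 400 N/m, m = 10 g -> f is about 31.8 Hz.
print(round(natural_frequency_hz(400.0, 0.01), 1))
# Amplitude decaying from 1.0 to 0.5 over 5 cycles -> small damping ratio.
print(round(damping_ratio_from_decay(1.0, 0.5, 5), 4))
```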
Separation Estimation with Thermal Cameras for Separation Monitoring in Human-Robot Collaboration
(2022)
Human-robot collaborative applications have the drawback of being less efficient than their non-collaborative counterparts. One of the main reasons is that the robot has to slow down when a human being is within its operating space. There are different approaches to dynamic speed and separation monitoring in human-robot collaborative applications. One approach additionally differentiates between human and non-human objects to increase the efficiency of speed and separation monitoring. This paper proposes to estimate the separation distance by measuring the temperature of the approaching object. Measurements show that the measured temperature of a human being decreases by 1 °C per meter of distance from the sensor. This allows an estimation of the separation between a robotic system and a human being.
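The reported gradient of 1 °C per meter suggests a simple linear inversion from measured temperature to separation distance. The reference temperature below is an assumed calibration value, not a figure from the paper:

```python
REFERENCE_TEMP_C = 36.0   # assumed measured skin temperature at ~0 m (calibration)
TEMP_DROP_C_PER_M = 1.0   # gradient reported in the abstract: 1 deg C per meter

def estimate_separation_m(measured_temp_c):
    """Invert the linear temperature model to estimate the separation
    distance between sensor and human; clamped at zero so readings above
    the reference temperature do not yield negative distances."""
    return max(0.0, (REFERENCE_TEMP_C - measured_temp_c) / TEMP_DROP_C_PER_M)

# A reading of 33 deg C corresponds to about 3 m under these assumptions:
print(estimate_separation_m(33.0))
```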
Efficient collaborative robotic applications need a combination of speed and separation monitoring, and power and force limiting operations. While most collaborative robots have built-in sensors for power and force limiting operations, there are none with built-in sensor systems for speed and separation monitoring. This paper proposes a system for speed and separation monitoring directly from the gripper of the robot. It can monitor separation distances of up to three meters. We used single-pixel Time-of-Flight sensors to measure the separation distance between the gripper and the next obstacle perpendicular to it. This is the first system capable of measuring separation distances of up to three meters directly from the robot's gripper.
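A single-pixel time-of-flight sensor measures the round-trip travel time of emitted light, from which the distance follows as d = c·t/2. A minimal sketch of that conversion (the timing value is illustrative; real ToF sensors resolve this via phase measurement or on-chip timing):

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_distance_m(round_trip_time_s):
    """Distance from a single-pixel time-of-flight measurement: the light
    travels to the obstacle and back, so d = c * t / 2."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# A 20 ns round trip corresponds to roughly 3 m -- the upper end of the
# separation range monitored from the gripper:
print(round(tof_distance_m(20e-9), 2))
```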
Avoiding collisions between a robot arm and any obstacle in its path is essential to human-robot collaboration. Multiple systems are available that can detect obstacles in the robot's way prior and subsequent to a collision, and they work well in different areas surrounding the robot. One area that is difficult to handle is the area hidden by the robot arm itself. This paper focuses on pick-and-place maneuvers, especially on obstacle detection between the robot arm and the table the robot is located on. It introduces the use of single-pixel time-of-flight sensors to detect obstacles directly from the robot arm. The proposed approach reduces the complexity of the problem by locking those axes of the robot that are not needed for the pick-and-place movement. The comparison of simulated results and laboratory measurements shows concordance.
Human-robot collaboration plays a strong role in industrial production processes. The ISO/TS 15066 defines four different methods of collaboration between humans and robots. So far, no robotic system has been available that incorporates all four collaboration methods at once. For speed and separation monitoring in particular, there was no sensor system that could easily be attached directly to an off-the-shelf industrial robot arm and detect obstacles at distances from a few millimeters up to five meters. This paper presents first results of using a 3D time-of-flight camera directly on an industrial robot arm for obstacle detection in human-robot collaboration. We attached a Visionary-T camera from SICK to the flange of a KUKA LBR iiwa 7 R800. Using Matlab, we evaluated the images and found that the approach works very well for detecting obstacles in a distance range from 0.5 m up to 5 m.
In safety-critical applications, wireless technologies are not widely used, mainly because of reliability and latency requirements. This paper presents a new wireless architecture that allows the latency and reliability to be customized for every single participant in the network. The architecture supports building a network of inhomogeneous participants with different reliability and latency requirements. The TDMA scheme used, with TDD as the duplex method, is economical with resources, so participants with different processing and energy resources are able to take part.
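In a TDMA frame, per-participant latency and throughput guarantees come down to which and how many slots each participant owns. The following toy scheduler illustrates that per-participant scalability; it is a hypothetical sketch, not the allocation scheme from the paper:

```python
def assign_tdma_slots(participants, frame_slots):
    """Toy TDMA scheduler: grant each participant a contiguous block of
    slots per frame according to its requested slot count. Illustrates how
    per-participant requirements map onto a shared frame (hypothetical
    scheme, not the one proposed in the paper).

    participants: dict mapping name -> number of slots requested per frame.
    frame_slots:  total slots available in one TDMA frame.
    """
    total = sum(participants.values())
    if total > frame_slots:
        raise ValueError("frame too short for the requested slots")
    schedule, next_slot = {}, 0
    for name, wanted in participants.items():
        schedule[name] = list(range(next_slot, next_slot + wanted))
        next_slot += wanted
    return schedule

# A latency-critical foot switch gets more slots per frame than a
# low-duty-cycle sensor node:
print(assign_tdma_slots({"foot_switch": 4, "sensor": 1}, frame_slots=10))
```

More slots per frame shorten the worst-case wait until a participant may transmit again, which is how such a scheme trades resources for latency per device.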
In medical applications, wireless technologies are not widely used. Today they are mainly deployed in non-latency-critical applications, where reliability can be guaranteed through retransmission protocols and error-correction mechanisms. Retransmission over the disturbed, shared wireless channel, however, increases latency, so retransmission protocols are not sufficient for replacing latency-critical wired connections in operating rooms, such as foot switches. Today's research aims to improve reliability through the physical characteristics of the wireless channel, using diversity methods and more robust modulation. This paper presents an architecture for building a reliable network. The architecture allows devices with different reliability, latency, and energy-consumption requirements to participate, and reliability, latency, and energy consumption are scalable for every single participant.