Robotics: Plug-and-play integration of the 3D vision sensor possible thanks to newly developed software
Universal Robots (UR) from Denmark is a worldwide pioneer in the field of modular, cost-efficient lightweight robots. Small and medium-sized companies in particular configure these collaborative robots individually from the modular system so that they work hand in hand with skilled staff. There are numerous applications for these robots: for example, they handle pick-and-place tasks in palletising and packaging, in assembly, or in machine tending.
3D vision sensor from ifm as UR+ solution
The robot “modular system” is complemented by components from other manufacturers that have been tested for compatibility and certified by Universal Robots. These include, for example, grippers, sensors, actuators and vision systems. ifm, one of the world’s leading suppliers of automation solutions, provides its O3D 3D vision sensor as a system component in the UR modular system. The core element of the sensor is a 3D camera chip. It creates a 3D image using PMD technology (photonic mixer device) and time-of-flight measurement. The resolution of the PMD image sensor is 176 by 132 pixels. For each of the 23,232 pixels the sensor calculates a precise distance value – up to 25 times per second. Unlike laser scanners, ifm’s 3D sensor has no moving parts, which makes it especially robust, compact and cost-effective. The image is evaluated in the sensor itself, so no external components are needed.
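The measuring principle behind each of those 23,232 distance values can be illustrated with a short sketch. PMD time-of-flight sensors typically derive distance from the phase shift between emitted and received modulated light; the exact modulation frequency used by the O3D is not stated in this article, so the value below is an illustrative assumption, not a device specification.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_distance(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Illustrative phase-based time-of-flight distance calculation.

    The modulated light travels to the object and back, hence the
    factor of 2 in the round trip (4*pi in the denominator).
    """
    return C * phase_shift_rad / (4 * math.pi * mod_freq_hz)

# Example with an assumed 30 MHz modulation frequency: a measured
# phase shift of pi radians corresponds to roughly 2.5 m distance.
d = tof_distance(math.pi, 30e6)
```

The sketch also shows why such sensors have a finite unambiguous range: once the phase shift exceeds a full cycle, distances wrap around.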
URCaps plugin for easy integration
For easy integration of the 3D vision sensor, the Danish Technological Institute (DTI) has developed the “URCap” software plugin in cooperation with ifm. The plugin provides not only the interface itself but also an easy-to-use graphical user interface. As a plug-and-play solution, the URCap ensures direct communication between the ifm sensor and the robot controller. Its special benefit is ease of handling: the user does not have to carry out any complex programming; only parameter settings are required. These can be taught easily via the UR operator terminal thanks to the consistent software integration.
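To give a feel for what “direct communication with the robot controller” involves, the sketch below parses a sensor result frame into a typed record that a robot program could consume. The semicolon-separated ASCII format, field names and units are purely hypothetical assumptions for illustration; the actual wire format of the O3D/URCap interface is not described in this article.

```python
from dataclasses import dataclass

@dataclass
class ObjectResult:
    """Hypothetical result record: object position and rotation."""
    x_mm: float
    y_mm: float
    angle_deg: float

def parse_result(frame: str) -> ObjectResult:
    """Parse an assumed ASCII frame of the form 'x;y;angle'.

    This stands in for whatever message the plugin actually
    exchanges with the controller; it is not the documented protocol.
    """
    x, y, angle = (float(field) for field in frame.strip().split(";"))
    return ObjectResult(x_mm=x, y_mm=y, angle_deg=angle)

# Example: a frame reporting an object at (120.5, -33.0) mm, rotated 15 degrees.
result = parse_result("120.5;-33.0;15.0")
```

The point of the plug-and-play plugin is precisely that the user never has to write this kind of glue code by hand.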
After teaching, the O3D sensor is well suited to gripper navigation. It detects the object position, even for moving objects, and transmits it to the robot controller, which guides the gripper. The system can detect rectangular, round and irregular shapes and transmit not only the position of their centre of gravity but also the number and dimensions of the detected objects to the controller.
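The quantities mentioned above – centre of gravity, dimensions and object count – can be computed from a segmented image in a few lines. This is a minimal sketch of the general idea on a binary pixel mask, not the algorithm running inside the O3D.

```python
def object_metrics(mask):
    """Centre of gravity and axis-aligned dimensions of a blob.

    mask: list of pixel rows, each a list of 0/1 values where 1 marks
    a pixel belonging to the detected object.
    Returns ((cx, cy), (width, height), pixel_count).
    """
    # Collect the coordinates of all object pixels.
    points = [(r, c)
              for r, row in enumerate(mask)
              for c, value in enumerate(row) if value]
    n = len(points)
    # Centre of gravity = mean of the pixel coordinates.
    cy = sum(r for r, _ in points) / n
    cx = sum(c for _, c in points) / n
    # Axis-aligned bounding-box dimensions in pixels.
    height = max(r for r, _ in points) - min(r for r, _ in points) + 1
    width = max(c for _, c in points) - min(c for _, c in points) + 1
    return (cx, cy), (width, height), n

# A 2x2 square object in a 3x3 image:
centre, size, count = object_metrics([[0, 0, 0],
                                      [0, 1, 1],
                                      [0, 1, 1]])
```

In the sensor, these per-object metrics would then be converted from pixel coordinates to robot coordinates before being sent to the controller.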
Conclusion: Thanks to the smooth interaction of powerful hardware and easy-to-handle software, users can now easily integrate a leading vision sensor into their Universal Robots gripper applications.