rc_visard Documentation Revision 1.8.
Contents

1 Introduction
2 Safety
  2.1 General warnings
  2.2 Intended use
3 Hardware specification
  3.1 Scope of delivery
  3.2 Technical specification
  3.3 Environmental and operating conditions
  3.4 Power-supply specifications
  3.5 Wiring
8 Interfaces
  8.1 GigE Vision 2.0/GenICam image interface
  8.2 REST-API interface
  8.3 The rc_dynamics interface
  8.4 KUKA Ethernet KRL Interface
  8.5 Time synchronization
1 Introduction

Revisions

This product may be modified without notice, when necessary, due to product improvements, modifications, or changes in specifications. If such modification is made, the manual will also be revised; see revision information.

Documentation Revision 1.8.1-2-g3c019b5, Dec 12, 2019. Applicable to rc_visard firmware 1.8.x.

Copyright

This manual and the product it describes are protected by copyright.
1.1 Overview The 3D sensor rc_visard provides real-time camera images and disparity images, which are also used to compute depth images and 3D point clouds. Additionally, it provides confidence and error images as quality measures for each image acquisition. The sensor provides self-localization based on image and inertial data. A mobile navigation solution can be established with the optional on-board SLAM module.
1.2 Warranty

Any changes or modifications not expressly approved by Roboception could void the user’s warranty and guarantee rights.

Warning: The rc_visard sensor utilizes complex hardware and software technology that may not always function as intended. The purchaser must design its application to ensure that any failure of the rc_visard sensor does not cause personal injury, property damage, or other losses.

Warning: Do not attempt to take apart, open, service, or modify the rc_visard.
1.3 Applicable standards

1.3.1 Interfaces

The rc_visard supports the following interface standards:

The Generic Interface for Cameras (GenICam) standard is the basis for plug & play handling of cameras and devices.

GigE Vision® is an interface standard for transmitting high-speed video and related control data over Ethernet networks.
• CISPR 24:2015 +A1:2015, International special committee on radio interference, Information technology equipment - Immunity characteristics - Limits and methods of measurement
• EN 61000-6-2:2005, Electromagnetic compatibility (EMC) - Part 6-2: Generic standards - Immunity for industrial environments
• EN 61000-6-3:2007 +A1:2011, Electromagnetic compatibility (EMC) - Part 6-3: Generic standards - Emission standard for residential, commercial and light-industrial environments
1.4 Glossary

DHCP
  The Dynamic Host Configuration Protocol (DHCP) is used to automatically assign an IP address to a network device. Some DHCP servers only accept known devices. In this case, an administrator needs to configure the DHCP server with the fixed MAC address of a device.

DNS, mDNS
  The Domain Name Server (DNS) manages the host names and IP addresses of all network devices. It is responsible for resolving the host name into the IP address for communication with a device.
use datagram sockets to bind to the endpoint of the data transmission, consisting of a combination of an IP address and a service port number such as 192.168.0.100:49500, which is typically referred to as the destination of an rc_dynamics data stream in this documentation.

URI, URL
  A Uniform Resource Identifier (URI) is a string of characters identifying resources of the rc_visard’s REST-API.
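The datagram-socket wording above can be sketched in code. The following is a minimal, illustrative example only (Python assumed; the port number is a placeholder and decoding of the actual rc_dynamics stream payload is deliberately omitted):

```python
import socket

def receive_stream(port, timeout=5.0, bufsize=65536):
    """Bind a UDP (datagram) socket to the endpoint configured as the
    destination of an rc_dynamics data stream and receive one datagram.

    Returns the raw payload and the sender address; decoding the payload
    depends on the configured stream format and is not shown here.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.bind(("", port))  # empty host string binds all local interfaces
    try:
        data, sender = sock.recvfrom(bufsize)
        return data, sender
    finally:
        sock.close()
```

The destination configured on the device (e.g. 192.168.0.100:49500) must match the IP address of the receiving host and the port the socket is bound to.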
2 Safety Warning: The operator must have read and understood all of the instructions in this manual before handling the rc_visard sensor.
Risk assessment and final application: The rc_visard may be used on a robot. The robot, the rc_visard, and any other equipment used in the final application must be evaluated with a risk assessment. It is the system integrator’s duty to ensure that all local safety measures and regulations are observed. Depending on the application, there may be risks that require additional protection/safety measures.

2.2 Intended use

The rc_visard is intended for data acquisition (e.g.
3 Hardware specification

Note: The following hardware specifications are provided as a general reference; the actual product may differ.

3.1 Scope of delivery

Standard delivery for an rc_visard includes the rc_visard sensor and a quickstart guide only. The full manual is available in digital form: it is always installed on the sensor, accessible through the Web GUI (Section 4.5), and available at http://www.roboception.com/documentation.
3.2 Technical specification

The common technical specifications for both rc_visard variants are given in Table 3.2.1.
Fig. 3.2.1: Overall dimensions of the rc_visard 65

Fig. 3.2.2: Overall dimensions of the rc_visard 160
CAD models of the rc_visard can be downloaded from http://www.roboception.com/download. The CAD models are provided as-is, with no guarantee of correctness. When a material property of aluminium is assigned (density of 2.76 g/cm³), the mass properties of the CAD model are within 5% of the product with respect to weight and center of mass, and within 10% with respect to moment of inertia.

3.3 Environmental and operating conditions

The rc_visard is designed for industrial applications.
Warning: Exceeding maximum power rating values may lead to damage of the rc_visard, power supply, and connected equipment.

Warning: Each rc_visard must be powered by a separate power supply.

Warning: Connection to domestic grid power is allowed only through a power supply certified as EN 55011 Class B.

3.5 Wiring

Cables are not provided with the rc_visard standard package. It is the customer’s responsibility to obtain the proper cabling. Accessories (Section 10) provides an overview of suggested components.
Fig. 3.5.2: Pin positions for the power and Ethernet connectors (Ethernet: M12 8-pin socket connector, A-coded, view onto camera; Power/GPIO: M12 8-pin plug connector, A-coded, view onto camera)

Pin assignments for the Ethernet connector are given in Fig. 3.5.3.

Fig. 3.5.3: Pin assignments for M12 to Ethernet cabling:

M12 pin 1: RJ45 pin 5 (WH-BU)
M12 pin 2: RJ45 pin 7 (WH-BN)
M12 pin 3: RJ45 pin 8 (BN)
M12 pin 4: RJ45 pin 2 (OG)
M12 pin 5: RJ45 pin 3 (WH-GN)
M12 pin 6: RJ45 pin 1 (WH-OG)
M12 pin 7: RJ45 pin 4 (BU)
M12 pin 8: RJ45 pin 6 (GN)

Pin assignments for the power connector are given in Table 3.5.1.
GPIO circuitry and specifications are shown in Fig. 3.5.4. The maximum rated voltage for GPIO In and GPIO Vcc is 30 V.

GPIO In (GPIO_In1, GPIO_In2, via 2 kΩ input resistors): Uin_low = 0 VDC, Uin_high = 11 VDC to 30 VDC, Iin = 5 mA to 13 mA.

GPIO Out (GPIO_Out1, GPIO_Out2, via 180 Ω output resistors): Uext = 5 VDC to 30 VDC, Iout = max. 50 mA.

Fig. 3.5.4: GPIO circuitry and specifications - do not connect signals higher than 30 V

Warning: Do not connect signals with voltages higher than 30 V to the rc_visard.
Fig. 3.6.1: Mounting points for connecting the rc_visard to robots or other mountings: a UNC 1/4"-20 tripod thread (thread depth = 5) and 3x M4 mounting threads (thread depth = 6) for dynamic applications

For troubleshooting and static applications, the sensor may be mounted using the standardized tripod thread (UNC 1/4"-20) indicated at the coordinate-frame origin.
Fig. 3.7.1: Approximate location of the sensor/camera coordinate frame (inside the left lens) and the mounting-point frame (at the tripod thread) for the rc_visard 65

Approximate locations of the sensor/camera coordinate frame and the mounting-point frame for the rc_visard 160 are shown in Fig. 3.7.2.

Fig. 3.7.2: Approximate location of the sensor/camera coordinate frame (inside the left lens) and the mounting-point frame (at the tripod thread) for the rc_visard 160
4 Installation Warning: The instructions on Safety (Section 2) related to the rc_visard must be read and understood prior to installation. 4.1 Installation and configuration The rc_visard offers a Gigabit Ethernet interface for connecting the device to a computer network. All communications to and from the device are performed via this interface. The rc_visard has an on-board computing resource that requires booting time after powering up the device. 4.
4.3.1 Automatic configuration (factory default) The Dynamic Host Configuration Protocol (DHCP) is preferred for setting an IP address. If DHCP is active on the rc_visard, which is the factory default, then the device tries to contact a DHCP server at startup and every time the network cable is plugged in. If a DHCP server is available on the network, then the IP address is automatically configured. In some networks, the DHCP server is configured so that it only accepts known devices.
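In networks where the DHCP server only accepts known devices, an administrator typically registers the rc_visard's MAC address with a fixed lease. As an illustrative sketch only (ISC dhcpd syntax; the host name, MAC address, and IP address are placeholders, not product defaults, and the MAC must be replaced with the one printed on the device's label):

```text
host rc-visard-1 {
  # MAC address as printed on the rc_visard's label (placeholder value)
  hardware ethernet 00:14:2d:xx:xx:xx;
  fixed-address 192.168.0.50;
}
```

With such an entry in place, the rc_visard receives the same IP address every time it contacts the DHCP server.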
Fig. 4.4.1: rcdiscover-gui tool for finding connected rc_visard devices After successful discovery, a double click on the device row opens the Web GUI (Section 4.5) of the device in the operating system’s default web browser. Mozilla Firefox is recommended as web browser. 4.4.1 Resetting configuration A misconfigured device can be reset by using the Reset rc_visard button in the discovery tool. The reset mechanism is only available for two minutes after device startup.
4.5 Web GUI The rc_visard’s Web GUI can be used to test, calibrate, and configure on-board processing. It can be accessed from any web browser, such as Firefox, Google Chrome, or Microsoft Edge, via the sensor’s IP address. The easiest way to access the Web GUI is to simply double click on the desired device using the rcdiscover-gui tool as explained in Discovery of rc_visard devices (Section 4.4).
• The Dynamics module shows the location and movement of image features that are used to compute the rc_visard’s egomotion. Settings include the number of corners and features that should be used. See Parameters (Section 6.4.1) for more information. • The Camera Calibration module permits the camera to be checked for proper calibration. In rare cases when the camera is no longer sufficiently calibrated, calibration can also be performed using this module. See Camera calibration (Section 6.
5 The rc_visard in a nutshell The rc_visard is a self-registering 3D camera. It provides rectified camera, disparity, confidence, and error images, which enable the viewed scene’s depth values along with their uncertainties to be computed. Furthermore, the motion of visual features in the images is combined with acceleration and turn-rate measurements at a high rate, which enables the sensor to provide real-time estimates of its current pose, velocity, and acceleration. 5.
For stereo matching, the position and orientation of the left and right cameras relative to each other have to be known with very high accuracy. This is achieved by calibration. The rc_visard’s cameras are pre-calibrated during production. However, if the rc_visard has been decalibrated, during transport for example, then the user has to recalibrate the stereo camera.
gives global orientation in the vertical direction. Further, IMU measurements have a high rate of 200 Hz. The rc_visard’s linear velocity, position, and orientation can be computed by integrating the IMU measurements. However, the integration results suffer from increasing drift over time.
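The drift mentioned above can be illustrated numerically: double-integrating even a tiny constant accelerometer bias makes the velocity error grow linearly and the position error quadratically with time. A small sketch (Python; the bias value and 200 Hz step size are illustrative, not device specifications):

```python
def integrate_bias(bias, dt, steps):
    """Integrate a constant accelerometer bias twice (velocity, then position).

    Illustrates why pure IMU integration drifts: a bias of `bias` m/s^2
    produces roughly 0.5 * bias * T^2 of position error after time T.
    """
    v = 0.0  # accumulated velocity error in m/s
    p = 0.0  # accumulated position error in m
    for _ in range(steps):
        v += bias * dt
        p += v * dt
    return v, p

# Example: a 0.01 m/s^2 bias integrated at 200 Hz for 10 s already yields
# about 0.5 m of position error, which motivates fusion with visual odometry.
```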
6 Software components The rc_visard comes with several on-board software components, which provide camera images, 3D information, and dynamics state estimates, and allow calibration to be performed. Each software component corresponds to a node in the REST-API interface (Section 8.2). Fig. 6.1 gives an overview of the relationships between the different software components and the data they provide via rc_visard’s various interfaces (Section 8).
– Visual odometry (rc_stereovisodo, Section 6.4) estimates the motion of the rc_visard device based on the motion of characteristic visual features in the left camera images. – Stereo INS (rc_stereo_ins, Section 6.5) combines visual odometry measurements with readings from the on-board Inertial Measurement Unit (IMU) to provide accurate and high-frequency state estimates in real time. • Camera calibration (rc_cameracalib, Section 6.
The focal length 𝑓 is the distance between the common image plane and the optical centers of the left and right cameras. It is measured in pixels. The baseline 𝑡 is the distance between the optical centers of the two cameras. The image width 𝑤 and height ℎ are measured in pixels, too. 𝑠1 and 𝑠2 are scale factors ensuring that the third coordinates of the image points 𝑝𝑙 and 𝑝𝑟 are equal to 1.
Table 6.1.2: The rc_stereocamera component’s status values

baseline: Stereo baseline 𝑡 in meters
color: 0 for monochrome cameras, 1 for color cameras
exp: Actual exposure time in seconds. This value is shown below the image preview in the Web GUI as Exposure (ms).
focal: Focal length factor normalized to an image width of 1
fps: Actual frame rate of the camera images in Hertz. This value is shown in the Web GUI below the image preview as FPS (Hz).
gain
height
temp_left
temp_right
time
width
Description of run-time parameters

Fig. 6.1.1: The Web GUI’s Camera tab

fps (FPS): This value is the cameras’ frame rate (fps, frames per second), which determines the upper frequency at which depth images can be computed. This is also the frequency at which the rc_visard delivers images via GigE Vision. Reducing this frequency also reduces the network bandwidth required to transmit the images.
Fig. 6.1.2: Images are internally always captured at 25 Hz. The fps parameter determines how many of them are sent as camera images via GigE Vision.

exp_auto (Exposure Auto or Manual): This value can be set to 1 for auto-exposure mode, or to 0 for manual exposure mode. In manual exposure mode, the chosen exposure time is kept, even if the images are overexposed or underexposed.
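Parameters such as fps and exp_auto can also be changed programmatically via the REST-API (Section 8.2). The sketch below assumes an endpoint layout of the form /api/v1/nodes/&lt;node&gt;/parameters; the host address, helper names, and example values are placeholders, so verify them against the REST-API interface section before use:

```python
import json
import urllib.request

def build_parameter_url(host, node, params):
    """Build the (assumed) REST-API URL for setting run-time parameters.

    Parameters are passed as a query string; keys are sorted only to make
    the resulting URL deterministic.
    """
    query = "&".join("{}={}".format(k, v) for k, v in sorted(params.items()))
    return "http://{}/api/v1/nodes/{}/parameters?{}".format(host, node, query)

def set_parameters(host, node, params):
    """Send a PUT request to apply the parameters (requires a reachable device)."""
    req = urllib.request.Request(build_parameter_url(host, node, params),
                                 method="PUT")
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode())

# Example (hypothetical host address):
# set_parameters("192.168.0.50", "rc_stereocamera", {"fps": 10, "exp_auto": 0})
```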
Warning: The user must be aware that by calling this service, the current parameter settings for the camera component are irrecoverably lost. This service requires no arguments. This service returns no response. 6.2 Stereo matching The stereo matching component uses the rectified stereo-image pair and computes disparity, error, and confidence images. 6.2.
where 𝑓 is the focal length after rectification in pixels and 𝑡 is the stereo baseline in meters, which was determined during calibration. These values are also transferred over the GenICam interface (see Custom GenICam features of the rc_visard, Section 8.1.1). Note: The rc_visard reports a focal length factor via its various interfaces. It relates to the image width for supporting different image resolutions.
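The relation above can be inverted to recover depth from a disparity value, z = f·t/d. A minimal sketch (Python; the calibration numbers in the example are illustrative only, not values of a real device):

```python
def focal_from_factor(focal_factor, width_px):
    """Convert the width-normalized focal length factor reported by the
    device into a focal length in pixels for a given image width."""
    return focal_factor * width_px

def disparity_to_depth(d_px, focal_px, baseline_m):
    """Convert a disparity in pixels to a depth in meters via z = f*t/d."""
    if d_px <= 0:
        return float("inf")  # zero disparity corresponds to infinite distance
    return focal_px * baseline_m / d_px

# Illustrative numbers only: f = 1075 px and t = 0.065 m give roughly
# 1.75 m of depth for a disparity of 40 px.
```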
The rc_visard provides time-stamped disparity, error, and confidence images over the GenICam interface (see Chunk data, Section 8.1.1). Live streams of the images are provided with reduced quality in the Web GUI (Section 4.5). 6.2.4 Parameters The stereo matching component is called rc_stereomatching in the REST-API and it is represented by the Depth Image tab in the Web GUI (Section 4.5). The user can change the stereo matching parameters there, or use the REST-API (REST-API interface, Section 8.
Table 6.2.2: The rc_stereomatching component’s status values

fps: Actual frame rate of the disparity, error, and confidence images. This value is shown in the Web GUI below the image preview as FPS (Hz).
Fig. 6.2.2: The Web GUI’s Depth Image tab

acquisition_mode (Acquisition Mode): The acquisition mode can be set to Continuous, Single, or Single + Out1. Continuous is the default; it performs stereo matching continuously according to the user-defined frame rate and the available computation resources. The other two modes perform stereo matching upon each click of the Acquire button.
set to ExposureAlternateActive upon each trigger call and reset to Low after receiving images for stereo matching. Note: The Single + Out1 mode can only change the out1_mode if the IOControl license is available on the rc_visard. quality (Quality) Disparity images can be computed in different resolutions: high (640 x 480), medium (320 x 240) and low (214 x 160). The lower the resolution, the higher the frame rate of the disparity image. A 25 Hz frame rate can be achieved only at the lowest resolution.
mindepth (Minimum Distance): The minimum distance is the smallest distance from the sensor at which measurements should be possible. Larger values implicitly reduce the disparity range, which also reduces the computation time. The minimum distance is given in meters.

maxdepth (Maximum Distance): The maximum distance is the largest distance from the sensor at which measurements should be possible. Pixels with larger distance values are set to invalid in the disparity image.
6.3 Sensor dynamics The dynamics component provides estimates of the sensor state. These include pose, linear velocity, linear acceleration, and rotational rates. The component handles starting and stopping, and streaming of the estimates for individual subcomponents: • Visual odometry (rc_stereovisodo) estimates the camera’s motion from the motion of characteristic image points in the left camera images (Section 6.4).
Warning: The stereo INS component self-calibrates the IMU during its initialization. It is therefore required that the rc_visard is not moving and sufficient texture is visible during startup of the stereo INS component. 6.3.2 Available state estimates The rc_visard provides seven different kinds of timestamped state-estimate data streams via the rc_dynamics interface (see The rc_dynamics interface, Section 8.
• linear velocities v = (𝑣𝑥, 𝑣𝑦, 𝑣𝑧)ᵀ in m/s,
• angular velocities 𝜔 = (𝜔𝑥, 𝜔𝑦, 𝜔𝑧)ᵀ in rad/s,
• gravity-compensated linear accelerations a = (𝑎𝑥, 𝑎𝑦, 𝑎𝑧)ᵀ in m/s², and
• transformation from camera to IMU coordinate frame as pose with frame name and parent frame name.

For each component, the stream also provides the name of the coordinate frame in which the values are given.
This service requires no arguments. This service returns the following response: { "accepted": "bool", "current_state": "string" } stop Stops the stereo INS and – if running – the SLAM components. The trajectory estimate of the SLAM component will still be available. Transitions from state RUNNING or RUNNING_WITH_SLAM through STOPPING to IDLE. This service requires no arguments.
Fig. 6.3.2: Simplified state and transition diagram These services shall respond quickly. Therefore, for services that cause a state transition the value of the returned current_state in general is the first new (intermediate) state that was transitioned to, not the final state. E.g., for the start command the returned current_state will be WAITING_FOR_INS, not state RUNNING. If the transition does not take place within 0.1 seconds, the current state is returned. See Table 6.3.
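A client that needs the final state rather than the first intermediate one can simply poll until the reported state settles. A minimal sketch (Python; wait_for_state and the injected get_state callable are hypothetical helper names, not part of the product's API, and get_state would in practice query current_state via the REST-API):

```python
import time

def wait_for_state(get_state, final_states, timeout=10.0, interval=0.1):
    """Poll get_state() until one of final_states is reported.

    Mirrors the behavior described above: a service call may return an
    intermediate state such as WAITING_FOR_INS, so the client keeps
    polling until a final state (e.g. RUNNING or IDLE) is reached.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        state = get_state()
        if state in final_states:
            return state
        time.sleep(interval)
    raise TimeoutError("state did not settle within {} s".format(timeout))
```

For example, after calling the start service one would poll for {"RUNNING", "FATAL"}-like final states instead of treating the returned WAITING_FOR_INS as the outcome.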
6.4 Visual odometry Visual odometry is part of the sensor dynamics component. It is used to estimate the camera’s motion from the motion of characteristic image points (so-called image features) in left camera images. Image features are computed from image corners, which are image regions with high intensity gradients. Image features are used to look for matches between subsequent images to find correspondences. Their 3D coordinates are computed by stereo matching (independent from the disparity image).
Description of run-time parameters

Run-time parameters influence the number of features used to compute visual odometry. More features increase the visual odometry’s robustness at the expense of more run time, which can reduce the frame rate. Although the resulting state estimate will always have a high frequency due to fusion with IMU measurements, high visual-odometry frame rates are nevertheless desirable, since these measurements are much more accurate than IMU measurements alone.
start (Dynamics): This starts the sensor dynamics estimation components (see Services, Section 6.3.3).

disprange (Disparity Range): The disparity range gives the maximum disparity value for each image feature related to the resolution of the high-quality disparity image (640 x 480 pixels). The disparity range determines the minimum working distance of the visual odometry. When the disparity range is narrow, only more distant features are considered in the visual-odometry estimation.
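The link between the disparity range and the minimum working distance follows from the stereo relation z = f·t/d: the largest allowed disparity corresponds to the smallest usable distance. A sketch with illustrative numbers (Python; the focal length factor and baseline in the example are placeholders, not calibration values of a real device):

```python
def min_working_distance(disprange_px, focal_factor, baseline_m, width_px=640):
    """Smallest distance at which features are still usable, z_min = f*t/d_max.

    disprange_px is interpreted relative to the high-quality image width
    (640 px), as described above; focal_factor is the width-normalized
    focal length reported by the device.
    """
    focal_px = focal_factor * width_px
    return focal_px * baseline_m / disprange_px

# Illustrative numbers only: a focal factor of 0.75 and a 0.065 m baseline
# with a disparity range of 128 px give a minimum working distance of
# roughly 0.24 m; narrowing the range pushes that distance further out.
```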
which measure accelerations and turn rates in all three dimensions. By fusing IMU and visual-odometry measurements, the state estimate has the same frequency as the IMU (200 Hz) and is very robust even under challenging lighting conditions and for fast motions. Note: To achieve high-quality pose estimates, it must be ensured that sufficient texture is visible during runtime of the stereo INS component.
Under normal conditions, such as the absence of mechanical impact on the rc_visard, self-calibration should never occur. Self-calibration allows the rc_visard to work normally even after misalignment is detected, since it is automatically corrected. Nevertheless, checking camera calibration manually is recommended if the update counter is not 0. 6.6.2 Calibration process Manual calibration can be done through the Web GUI’s Camera Calibration tab.
Step 2: Verify calibration In the second step, the current calibration can be verified. To perform the verification, the grid must be held such that it is simultaneously visible in both cameras. Make sure that all black squares of the grid are completely visible and not occluded. A green check mark overlays each correctly detected square. The correct detection of the grid is only possible if all of the black squares are detected.