openni_launch
Package Summary
Launch files to open an OpenNI device and load all nodelets to convert raw depth/RGB/IR streams to depth images, disparity images, and (registered) point clouds.
- Author: Patrick Mihelich
- License: BSD
- Repository: wg-kforge
- Source: hg https://kforge.ros.org/openni/openni_ros
New in ROS Electric
Overview
This package contains launch files for using OpenNI-compliant devices such as the Microsoft Kinect in ROS. It creates a nodelet graph to transform raw data from the device driver into point clouds, disparity images, and other products suitable for processing and visualization.
Quick start
Launch the OpenNI driver:
roslaunch openni_launch openni.launch
To visualize in rviz:
rosrun rviz rviz
Set the Fixed Frame (top left of rviz window) to /camera_depth_optical_frame.
Add a PointCloud2 display, and set the topic to /camera/depth/points. Turning the background to light gray can help with viewing. This is the unregistered point cloud in the frame of the depth (IR) camera. It is not matched with the RGB camera images.
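If you would rather consume the cloud programmatically than in rviz, here is a minimal rospy sketch; it assumes the default camera namespace and that your sensor_msgs version ships the point_cloud2 helper module:
#!/usr/bin/env python
import rospy
from sensor_msgs.msg import PointCloud2
import sensor_msgs.point_cloud2 as pc2  # helper for iterating over PointCloud2 data
def callback(cloud):
    # Print the first few XYZ points; pixels with no depth reading are NaN and skipped
    for i, (x, y, z) in enumerate(pc2.read_points(cloud, field_names=("x", "y", "z"), skip_nans=True)):
        if i >= 5:
            break
        rospy.loginfo("point %d: (%.3f, %.3f, %.3f) m", i, x, y, z)
rospy.init_node("depth_cloud_listener")
rospy.Subscriber("/camera/depth/points", PointCloud2, callback)
rospy.spin()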
Alternatively you can view the disparity image:
rosrun image_view disparity_view image:=/camera/depth/disparity
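The stereo_msgs/DisparityImage message carries the focal length f (in pixels) and baseline T (in meters) used to form it, so depth can be recovered as depth = f * T / disparity. A rough rospy sketch, assuming the default camera namespace and densely packed rows (step == 4 * width):
import rospy
import numpy as np
from stereo_msgs.msg import DisparityImage
def callback(msg):
    # msg.image is a 32FC1 image of disparities in pixels
    disp = np.frombuffer(msg.image.data, dtype=np.float32).reshape(msg.image.height, msg.image.width)
    valid = disp > max(msg.min_disparity, 0.0)
    if valid.any():
        depth = msg.f * msg.T / disp[valid]  # meters
        rospy.loginfo("median depth: %.3f m", float(np.median(depth)))
rospy.init_node("disparity_to_depth")
rospy.Subscriber("/camera/depth/disparity", DisparityImage, callback)
rospy.spin()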
Now let's look at a registered point cloud, aligned with the RGB data. Open the dynamic reconfigure GUI:
rosrun dynamic_reconfigure reconfigure_gui
And select /camera/driver from the drop-down menu. Enable the depth_registration checkbox.
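The same switch can be flipped programmatically with the dynamic_reconfigure Python client; a minimal sketch (the node name is arbitrary, the parameter name is the one shown in the GUI):
import rospy
import dynamic_reconfigure.client
rospy.init_node("enable_registration")
# Connect to the driver's dynamic_reconfigure server and enable hardware registration
client = dynamic_reconfigure.client.Client("/camera/driver")
client.update_configuration({"depth_registration": True})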
Now go back to rviz, and change your PointCloud2 topic to /camera/depth_registered/points. Set Color Transformer to RGB8. You should see a colored 3D point cloud of your scene.
Again, you can view the registered disparity image:
rosrun image_view disparity_view image:=/camera/depth_registered/disparity
To view the color image from the RGB camera outside of rviz:
rosrun image_view image_view image:=/camera/rgb/image_color
or to view the grayscale image:
rosrun image_view image_view image:=/camera/rgb/image_mono
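To get at the pixel data in your own node, you can also interpret the sensor_msgs/Image buffer directly. A minimal sketch, assuming a 3-channel 8-bit encoding (rgb8 or bgr8) and densely packed rows (step == 3 * width):
import rospy
import numpy as np
from sensor_msgs.msg import Image
def callback(msg):
    # Treat the buffer as a height x width x 3 array of 8-bit channels
    img = np.frombuffer(msg.data, dtype=np.uint8).reshape(msg.height, msg.width, 3)
    rospy.loginfo("%s %dx%d, mean intensity %.1f", msg.encoding, msg.width, msg.height, img.mean())
rospy.init_node("rgb_listener")
rospy.Subscriber("/camera/rgb/image_color", Image, callback)
rospy.spin()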
Registration
The depth_registered/* topics can be produced in two ways.
If OpenNI registration is enabled:
- The unregistered camera/depth/* topics are not published.
- The raw depth image from OpenNI is published in camera/depth_registered instead.
If OpenNI registration is disabled and you have calibrated the cameras to each other:
- camera/depth_registered/image_raw is computed from camera/depth/image_raw using the camera intrinsics and the transform between the two cameras (sketched below).
- Both registered and unregistered depth outputs are published.
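Conceptually, software registration back-projects each depth pixel through the depth camera intrinsics, transforms the resulting 3D point into the RGB camera frame using the extrinsic calibration, and re-projects it with the RGB intrinsics. The single-pixel sketch below illustrates the idea only; it is not the depth_image_proc implementation, and the intrinsics and baseline are placeholder values:
import numpy as np
def register_pixel(u, v, z, K_depth, K_rgb, R, t):
    """Map one depth pixel (u, v) with depth z (meters) into RGB image coordinates.
    K_depth, K_rgb: 3x3 intrinsic matrices; R, t: depth-to-RGB rotation and translation."""
    # Back-project through the depth intrinsics to a 3D point in the depth frame
    p_depth = z * np.linalg.inv(K_depth).dot(np.array([u, v, 1.0]))
    # Transform into the RGB camera frame
    p_rgb = R.dot(p_depth) + t
    # Re-project with the RGB intrinsics
    uvw = K_rgb.dot(p_rgb)
    return uvw[0] / uvw[2], uvw[1] / uvw[2], p_rgb[2]
# Placeholder calibration: identical pinhole intrinsics and a 2.5 cm baseline
K = np.array([[525.0, 0.0, 319.5], [0.0, 525.0, 239.5], [0.0, 0.0, 1.0]])
print(register_pixel(320, 240, 1.0, K, K, np.eye(3), np.array([-0.025, 0.0, 0.0])))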
Launch files
openni.launch
Launches in one process the device driver and many processing nodelets that turn the raw RGB and depth images into useful products, such as point clouds. Provides a default tf tree linking the RGB and depth cameras.
Arguments
camera (string, default: camera) - Descriptive name for your device. All published topics are pushed down into the <camera> namespace, and this also affects tf frame ids and some node names. If you are using multiple cameras, use this argument to disambiguate them.
device_id (string, default: #1) - Specifies which device to open. The following formats are recognized:
- #1 : use the first device found
- 2@3 : use the device on USB bus 2, address 3
- B00367707227042B : use the device with the given serial number
rgb_frame_id (string) - The tf frame id of the RGB camera.
depth_frame_id (string) - The tf frame id of the IR camera.
rgb_camera_info_url (string) - Calibration URL for the RGB camera. By default, looks in your ROS home directory for a calibration identified by the device serial number, e.g. $HOME/.ros/camera_info/rgb_B00367707227042B. If no calibration is found, uses a default camera model with a typical focal length and distortion unmodeled.
depth_camera_info_url (string) - Calibration URL for the IR/depth camera. By default, looks in your ROS home directory for a calibration identified by the device serial number, e.g. $HOME/.ros/camera_info/depth_B00367707227042B. If no calibration is found, uses a default camera model with the focal length reported by OpenNI and distortion unmodeled.
rgb (string, default: rgb) - Remap the rgb namespace.
ir (string, default: ir) - Remap the ir namespace.
depth (string, default: depth) - Remap the depth namespace.
depth_registered (string, default: depth_registered) - Remap the depth_registered namespace.
debug (bool, default: false) - If true, launches the nodelet manager in GDB for debugging.
Published Topics
RGB camera
rgb/camera_info (sensor_msgs/CameraInfo) - Camera calibration and metadata.
rgb/image_raw (sensor_msgs/Image) - Raw image from device. Format is Bayer GRBG for Kinect, YUV422 for PSDK.
rgb/image_mono (sensor_msgs/Image) - Monochrome unrectified image.
rgb/image_color (sensor_msgs/Image) - Color unrectified image.
rgb/image_rect (sensor_msgs/Image) - Monochrome rectified image.
rgb/image_rect_color (sensor_msgs/Image) - Color rectified image.
Depth camera
depth/camera_info (sensor_msgs/CameraInfo) - Camera calibration and metadata.
depth/image_raw (sensor_msgs/Image) - Raw image from device. Contains uint16 depths in mm (see the sketch after this list).
depth/image (sensor_msgs/Image) - Unrectified depth image. Contains float depths in m.
depth/image_rect (sensor_msgs/Image) - Rectified depth image. Contains float depths in m.
depth/disparity (stereo_msgs/DisparityImage) - Disparity image (inversely related to depth), for interop with stereo processing nodes.
depth/points (sensor_msgs/PointCloud2) - XYZ point cloud. If using PCL, subscribe as PointCloud<PointXYZ>.
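The two depth encodings above are easy to confuse: image_raw carries uint16 millimeters (0 where the sensor has no reading), while image and image_rect carry 32-bit floats in meters. A minimal rospy sketch converting the raw topic to meters, assuming the default camera namespace, little-endian data, and densely packed rows (step == 2 * width):
import rospy
import numpy as np
from sensor_msgs.msg import Image
def callback(msg):
    # image_raw is 16UC1: depth in millimeters, 0 where the sensor has no reading
    depth_mm = np.frombuffer(msg.data, dtype=np.uint16).reshape(msg.height, msg.width)
    depth_m = depth_mm.astype(np.float32) / 1000.0
    valid = depth_mm > 0
    if valid.any():
        rospy.loginfo("closest valid depth: %.3f m", float(depth_m[valid].min()))
rospy.init_node("raw_depth_listener")
rospy.Subscriber("/camera/depth/image_raw", Image, callback)
rospy.spin()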
Registered depth camera (aligned with RGB camera)
See Registration.
depth_registered/camera_info (sensor_msgs/CameraInfo) - Camera calibration and metadata. Same as camera/rgb/camera_info, but time-synced to the depth images.
depth_registered/image_raw (sensor_msgs/Image) - Raw image from device. Contains uint16 depths in mm.
depth_registered/image (sensor_msgs/Image) - Unrectified depth image. Contains float depths in m.
depth_registered/image_rect (sensor_msgs/Image) - Rectified depth image. Contains float depths in m.
depth_registered/disparity (stereo_msgs/DisparityImage) - Disparity image (inversely related to depth), for interop with stereo processing nodes.
depth_registered/points (sensor_msgs/PointCloud2) - XYZRGB point cloud. If using PCL, subscribe as PointCloud<PointXYZRGB> (see the sketch after this list).
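If you are not using PCL, the packed rgb channel can be unpacked by hand. A minimal rospy sketch, assuming the sensor_msgs point_cloud2 helper module, a little-endian machine, and the default camera namespace:
import struct
import rospy
from sensor_msgs.msg import PointCloud2
import sensor_msgs.point_cloud2 as pc2
def callback(cloud):
    for i, (x, y, z, rgb) in enumerate(pc2.read_points(cloud, field_names=("x", "y", "z", "rgb"), skip_nans=True)):
        if i >= 3:
            break
        # The rgb channel is a float whose bytes hold packed 8-bit B, G, R values (low byte is blue on little-endian)
        b, g, r, _ = struct.unpack("BBBB", struct.pack("f", rgb))
        rospy.loginfo("(%.3f, %.3f, %.3f) m  color (%d, %d, %d)", x, y, z, r, g, b)
rospy.init_node("registered_cloud_listener")
rospy.Subscriber("/camera/depth_registered/points", PointCloud2, callback)
rospy.spin()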
IR camera
ir/camera_info (sensor_msgs/CameraInfo) - Camera calibration and metadata.
ir/image_raw (sensor_msgs/Image) - Raw uint16 IR image.
ir/image_rect (sensor_msgs/Image) - Rectified IR image.
Provided tf Transforms
/<camera>_rgb_optical_frame → /<camera>_depth_optical_frame - Default estimate for the transform between the RGB and IR cameras. If OpenNI registration is disabled, this is used with the calibrated camera intrinsics to perform registration.
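To read this transform from your own code, a small tf listener sketch (frame names assume the default camera name "camera"):
import rospy
import tf
rospy.init_node("rgb_depth_extrinsics")
listener = tf.TransformListener()
# Wait for the static transform published by openni.launch, then look it up
listener.waitForTransform("/camera_rgb_optical_frame", "/camera_depth_optical_frame",
                          rospy.Time(0), rospy.Duration(5.0))
(trans, rot) = listener.lookupTransform("/camera_rgb_optical_frame",
                                        "/camera_depth_optical_frame", rospy.Time(0))
rospy.loginfo("RGB -> depth translation: %s, rotation (quaternion): %s", str(trans), str(rot))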