
OpenSource Name:

venkatrn/perception

OpenSource URL:

https://github.com/venkatrn/perception

OpenSource Language:

C++ 74.4%

OpenSource Introduction:

Deliberative Perception for Multi-Object Recognition and Localization

Overview

This library provides implementations of single- and multi-object instance localization from RGB-D sensor data (Microsoft Kinect, ASUS Xtion, etc.). These are based on the PERCH (Perception via Search) and D2P (Discriminatively-guided Deliberative Perception) algorithms.

Requirements

  • Ubuntu 14.04+
  • ROS Hydro+ (active development only on Indigo)

Setup

  1. Get ROS Indigo from http://wiki.ros.org/indigo/Installation/Ubuntu
  2. Set up a catkin workspace ~/my_workspace (http://wiki.ros.org/catkin/Tutorials/create_a_workspace).
  3. Download the perch.rosinstall file to your workspace and run the commands below (a rough sketch of the .rosinstall format is shown after them).
cd ~/my_workspace
wstool init src
wstool merge -t src perch.rosinstall    # add the repositories listed in perch.rosinstall
wstool update -t src                    # clone/update all source repositories
rosdep install --from-paths src --ignore-src --rosdistro indigo -y    # install system dependencies
catkin_make -DCMAKE_BUILD_TYPE=Release  # build the workspace in release mode
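
For reference, a .rosinstall file is a YAML list of the repositories that wstool should clone into src. The sketch below only illustrates the format; the entries are assumptions, not the contents of the actual perch.rosinstall that ships with this project.

# Hypothetical .rosinstall sketch (format illustration only; not the real perch.rosinstall)
- git:
    local-name: perception
    uri: https://github.com/venkatrn/perception.git
    version: master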

Demo

First, download the object CAD models (92 MB total):

roscd sbpl_perception 
chmod +x data/scripts/download_demo_models.sh
./data/scripts/download_demo_models.sh

An example RGB-D scene containing three objects is provided under sbpl_perception/demo. To run PERCH on it with the default parameters:

roscd sbpl_perception && mkdir visualization
roslaunch sbpl_perception demo.launch 

The 'expanded' states as well as the goal state will be saved under sbpl_perception/visualization. The expanded states will also be shown in an OpenCV window named "Expansions". To also save all the 'generated' (rendered) states to sbpl_perception/visualization, use:

roslaunch sbpl_perception demo.launch image_debug:=true

After the run, you should find the demo's input depth image and the output depth image under sbpl_perception/visualization.

Configuration parameters for the algorithm can be found under sbpl_perception/config/demo_env_config.yaml and sbpl_perception/config/demo_planner_config.yaml, along with descriptions of those parameters.
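
The parameters themselves are documented in those files; the snippet below is only a hedged sketch of what such YAML configuration might look like, and every key shown is an assumption rather than an actual parameter name from the repository.

# Hypothetical config sketch; key names are assumptions.
# See sbpl_perception/config for the real parameters and their documentation.
planner:
  max_planning_time: 60.0     # assumed: time limit for the search, in seconds
  inflation_epsilon: 5.0      # assumed: suboptimality bound on the solution
environment:
  sensor_resolution: 0.003    # assumed: depth discretization, in meters
  num_models: 3               # assumed: number of object models in the demo scene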

To pull changes from all related repositories in one go, run wstool update -t src.

For more information on setting up PERCH for your custom robot/sensor, reproducing experimental results, and API details, refer to the wiki.



