shinkansan / 2019-ugrp-dpoom

2019 DGIST DPoom project under UGRP: an SBC- and RGB-D-camera-based fully autonomous driving system for a mobile robot with indoor SLAM

Home Page: https://shinkansan.github.io/2019-UGRP-DPoom/

License: Apache License 2.0

Languages: Python 99.85%, Shell 0.15%
Topics: robotics, ros-kinetic, realsense2, autonomous-driving, depth-camera, robot-framework, d435i, slam, turtlebot3, ros, rtabmap, path-planning, hci

2019-ugrp-dpoom's Introduction

Indoor Mobile Robot Fully Autonomous Driving Project - DPoom

For further information, please visit our page

Click below for the introduction video.

Introduction Video

DPoom is an indoor mobile robot built on the Lattepanda Alpha single-board computer, with a fully autonomous driving system that uses a single Intel RealSense D435i RGB-D camera.

Keywords: autonomy, autonomous driving system, mobile robot, SLAM, ROS, RGB-D, low-end, global path planning, motion planning, ground segmentation, navigation, path tracking, control, Human-Robot Interaction

Installation

How to Run

Robot control

The robot controls two Dynamixel motors via ROS and OpenCR. To provide an easy way to control the robot, we built a driving control package named easyGo. See the Control Package page.
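As a rough illustration (not the easyGo API itself, which is documented on the Control Package page), a Turtlebot3-style base can be driven through the standard ROS /cmd_vel topic that OpenCR consumes:

    #!/usr/bin/env python
    # Minimal sketch, not the easyGo package: publish velocity commands
    # on the standard Turtlebot3 /cmd_vel topic for a few seconds.
    import rospy
    from geometry_msgs.msg import Twist

    rospy.init_node('drive_sketch')
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
    rate = rospy.Rate(10)  # 10 Hz command loop

    cmd = Twist()
    cmd.linear.x = 0.1   # forward speed [m/s]
    cmd.angular.z = 0.3  # yaw rate [rad/s]

    t_end = rospy.Time.now() + rospy.Duration(3.0)
    while not rospy.is_shutdown() and rospy.Time.now() < t_end:
        pub.publish(cmd)
        rate.sleep()
    pub.publish(Twist())  # zero command stops the robot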

SLAM

Mapping must be completed before deploying the robot. See the SLAM page. A minimal sketch of consuming the SLAM output follows below.
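Assuming the RTAB-Map setup publishes odometry on /rtabmap/odom (the topic named in the issues further down this page), a node can read the estimated pose like this; the topic name is an assumption, so adapt it to your launch file:

    #!/usr/bin/env python
    # Sketch only: read the robot pose estimated by the RTAB-Map pipeline.
    import rospy
    from nav_msgs.msg import Odometry

    def on_odom(msg):
        p = msg.pose.pose.position
        rospy.loginfo('x=%.2f y=%.2f', p.x, p.y)

    rospy.init_node('odom_listener_sketch')
    rospy.Subscriber('/rtabmap/odom', Odometry, on_odom)
    rospy.spin()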

Global path planning

Using the .pcd map, the robot plans a global path with our FMM-based modified A*. See the GPP page. A baseline sketch is shown below.
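For orientation only, here is plain 4-connected grid A*; the repository's actual planner is an FMM-based modification of it that works on the point-cloud map, so treat this as the baseline being modified, not the project's algorithm:

    # Plain grid A* (illustrative baseline, not the FMM-modified variant).
    import heapq

    def astar(grid, start, goal):
        """grid[r][c] == 0 means free; returns a list of cells or None."""
        rows, cols = len(grid), len(grid[0])
        h = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])  # Manhattan
        open_set = [(h(start, goal), 0, start, None)]
        came_from, g_cost = {}, {start: 0}
        while open_set:
            _, g, cur, parent = heapq.heappop(open_set)
            if cur in came_from:
                continue  # already expanded with an equal or cheaper cost
            came_from[cur] = parent
            if cur == goal:
                path = []
                while cur is not None:
                    path.append(cur)
                    cur = came_from[cur]
                return path[::-1]
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nxt = (cur[0] + dr, cur[1] + dc)
                if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                        and grid[nxt[0]][nxt[1]] == 0
                        and g + 1 < g_cost.get(nxt, float('inf'))):
                    g_cost[nxt] = g + 1
                    heapq.heappush(open_set,
                                   (g + 1 + h(nxt, goal), g + 1, nxt, cur))
        return None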

Motion planning

The robot can follow the generated path with our motion planner, which uses our real-time ground segmentation method named MORP. See the Motion Planning (MORP) page. A toy version of the underlying idea is sketched below.
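As a toy illustration of depth-based ground segmentation (not MORP itself), each depth pixel can be back-projected and its height tested against a known, level camera mounting; the intrinsics and camera height here are placeholder values:

    # Naive ground mask from a raw depth image; assumes a level,
    # forward-facing camera at a known height above the floor.
    import numpy as np

    def ground_mask(depth, fy, cy, cam_height, tol=0.05):
        """depth: HxW array in meters. Returns a boolean HxW ground mask."""
        h = depth.shape[0]
        v = np.arange(h, dtype=np.float32).reshape(-1, 1)  # pixel row index
        y = (v - cy) * depth / fy   # point height below the optical axis [m]
        valid = depth > 0           # zero depth means no measurement
        return valid & (np.abs(y - cam_height) < tol)

    # Placeholder intrinsics; read the real ones from the camera driver.
    # mask = ground_mask(depth_m, fy=615.0, cy=240.0, cam_height=0.20)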

Full autonomous driving

RGB-D localization, global path planning, and motion planning are integrated in one Python script. Just run integration.py. For details, see the Integration page.
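Structurally, the loop that integration.py is described as running looks like the sketch below; every function here is an illustrative stub, not the repository's actual API:

    # Localize -> plan -> follow loop, with stub implementations.
    import time

    def localize():           # stub: would return the RGB-D SLAM pose
        return (0.0, 0.0, 0.0)

    def plan(pose, goal):     # stub: would run the modified A* planner
        return [goal]

    def follow(pose, path):   # stub: would run MORP and return (v, w)
        return (0.1, 0.0)

    goal = (4.0, 2.0)         # target position in the map frame [m]
    for _ in range(180):      # the paper reports an 18 Hz control loop
        pose = localize()
        path = plan(pose, goal)
        v, w = follow(pose, path)
        time.sleep(1.0 / 18)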

Human-Computer Interaction

See Human-Computer Interaction Page.

Related Repositories

DPoom_Gazebo

DPoom is also available in ROS Gazebo simulation with equivalent code. To simulate DPoom in Gazebo: DPoom_gazebo

Gazebo-CrowdNav

Our navigation method can be simulated in Gazebo. Current state-of-the-art navigation approaches based on DRL (CADRL, LSTM-RL, SARL) are available with the DPoom platform. To evaluate navigation performance with DPoom in Gazebo: Gazebo-CrowdNav

Project Info

2019/12/05: We opened our GitHub repo to the public!


Purpose of this project

Mobile robots are developing rapidly in industry, yet problems such as expensive hardware and high power consumption still block practical deployment. In this study, we propose a navigation system that can run on a low-end computer with a single RGB-D camera, together with a mobile robot platform that operates the integrated autonomous driving system. The proposed system requires neither LiDARs nor GPUs. Our raw-depth-image ground segmentation extracts a traversability map for the safe driving of low-body mobile robots. The system is designed to guarantee real-time performance on a low-cost commercial single-board computer while running SLAM, global path planning, and motion planning together. Even with sensor data processing and the other autonomous driving functions running simultaneously, our navigation method refreshes control commands at 18 Hz, a faster rate than competing methods. Our method outperforms current state-of-the-art navigation approaches in 3D simulation tests. In addition, we demonstrate the applicability of our mobile robot system by driving autonomously in a residential lobby.

Our team

Taekyung Kim / DGIST Class of 2020 @ktk1501
Seunghyun Lim / DGIST Class of 2020 @SeunghyunLim
Gwanjun Shin / DGIST undergraduate @shinkansan
Geonhee Sim / DGIST undergraduate @jane79

Platform info

SOFTWARE

  • Ubuntu 16.04 LTS
  • ROS Kinetic
  • Python 3.6
  • Python 2.7 on Robot Platform
  • Realsense SDK 2
  • Tensorflow 1.8.0
  • OpenCV

HARDWARE

  • Platform Computing Unit: Lattepanda Alpha 864
  • Intel Realsense Camera D435i
  • Turtlebot3 waffle pi
  • Outer HW printed by 3D-Printer

Honors

DPoom won the 2019 Samsung Open Source Conference (SOSCON) Robot Competition. Media coverage

Paper info

Our paper was submitted to IROS 2021.


This work is partially supported by DGIST UGRP.

2019-ugrp-dpoom's People

Contributors: jane79, seunghyunlim, shinkansan, tkkim-robot

2019-ugrp-dpoom's Issues

How to use optimization_sample.py from pathplanning

Hello! I'm currently trying to achieve autonomous navigation for an A1 robot (https://www.unitree.com/products/a1/) which is equipped with a RealSense D435i. So first of all, thank you for this repo, it's really helping; I've managed to get localization from a pre-existing map file.
However, I'm currently trying to understand how astar.py and optimization_sample.py work. I've changed the path file and the parameters, but I don't know what I'm supposed to do. My cursor changes and allows me to select a section of my screen, but I don't know what to select. I've tried selecting different parts of lobby.jpg, or even just clicking, but nothing works and I end up with these errors:

[screenshot of the error output]

Any help would be appreciated, thank you very much!

Info about path planning module

Hi,

After generating the map with the D435i and activating the localization node, how is it possible to use the path planning module?
Thanks a lot,
Germal

Implementation issue of the opensource_tracking.launch

In the opensource_tracking.launch file, the filtered odometry given by the UKF of the robot_localization package is not fed into the rtabmap node. This means the rtabmap node subscribes to odometry from the rgbd_odometry node through the rtabmap/odom topic, so the filtered odometry from the UKF is not used in rtabmap, if I am not mistaken?

Isn't the correct implementation something like the following node graph?
[attached node graph: ekf_node_graph]

SLAM with D435i tutorial: how to evaluate the localization accuracy?

I am new to VSLAM. I am following the instructions of the "SLAM with D435i tutorial":
roslaunch realsense2_camera opensource_tracking.launch

I have successfully got the 2D occupancy map of my office.

As you said, "As a result of the comparison, the error of the data is just a few centimeters."

I want to know how to evaluate the localization accuracy without any ground truth. Could you give some suggestions?

Thx.

[screenshots of the generated occupancy map]

Links not working

Hello! I'm trying to follow your tutorial here: https://shinkansan.github.io/2019-UGRP-DPoom/SLAM.html, but the links to download your files are no longer working (404 not found). The files are: opensource_tracking_tk_online.launch, opensource_tracking_tk_localizationl.launch, odom_listener.py, and easyControl.py. So I was wondering if you could put them back, or even just send them to me. I'm very interested in what you've done!

Thanks!
