This project is forked from aws-deepracer/aws-deepracer-inference-pkg.

DeepRacer Inference Package

Overview

The DeepRacer Inference ROS package creates the inference_node, which is part of the core AWS DeepRacer application and is launched from the deepracer_launcher. More details about the application and its components can be found here.

This node is responsible for running inference on the selected model using the Intel OpenVINO Inference Engine APIs.

More details about the Intel OpenVINO Inference Engine can be found here: https://docs.openvinotoolkit.org/2021.1/openvino_docs_IE_DG_Deep_Learning_Inference_Engine_DevGuide.html

License

The source code is released under Apache 2.0 (https://aws.amazon.com/apache-2-0/).

Installation

Prerequisites

The DeepRacer device comes with all of the prerequisite packages and libraries required to run the inference_pkg. More details about the preinstalled packages and libraries on the DeepRacer device, and about installing the required build systems, can be found in the Getting Started section of the AWS DeepRacer Open Source page.

The inference_pkg specifically depends on the following ROS2 packages as build and execute dependencies:

  1. deepracer_interfaces_pkg - This package contains the custom message and service type definitions used across the AWS DeepRacer core application.
  2. cv_bridge - This contains CvBridge, which converts between ROS Image messages and OpenCV images.
  3. image_transport - This package provides transparent support for transporting images in low-bandwidth compressed formats.
  4. sensor_msgs - This package defines messages for commonly used sensors, including cameras and scanning laser rangefinders.

Downloading and Building

Open a terminal on the DeepRacer device and run the following commands as the root user.

  1. Switch to root user before you source the ROS2 installation:

     sudo su
    
  2. Source the ROS2 Foxy setup bash script:

     source /opt/ros/foxy/setup.bash 
    
  3. Set the environment variables required to run Intel OpenVino scripts:

     source /opt/intel/openvino_2021/bin/setupvars.sh
    
  4. Create a workspace directory for the package:

     mkdir -p ~/deepracer_ws
     cd ~/deepracer_ws
    
  5. Clone the inference_pkg on the DeepRacer device:

     git clone https://github.com/aws-deepracer/aws-deepracer-inference-pkg.git
    
  6. Fetch unreleased dependencies:

     cd ~/deepracer_ws/aws-deepracer-inference-pkg
     rosws update
    
  7. Resolve the inference_pkg dependencies:

     cd ~/deepracer_ws/aws-deepracer-inference-pkg && rosdep install -i --from-path . --rosdistro foxy -y
    
  8. Build the inference_pkg and deepracer_interfaces_pkg:

     cd ~/deepracer_ws/aws-deepracer-inference-pkg && colcon build --packages-select inference_pkg deepracer_interfaces_pkg
    

Usage

The inference_node provides a specific, core piece of functionality: running inference on the reinforcement learning models trained on the AWS DeepRacer simulator. Intel OpenVINO provides APIs to load an intermediate representation (IR) file for the model and create a core object that can be used to run inference. Although the node is built to work with the AWS DeepRacer application, it can be run independently for development, testing, and debugging purposes.
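The load-and-infer flow described above can be sketched in Python. The OpenVINO calls below follow the 2021 Inference Engine Python API (`IECore.read_network` / `load_network`), but the model paths, blob handling, and the softmax helper are illustrative assumptions, not code from this package. OpenVINO is only present on the DeepRacer device, so the import is guarded:

```python
import math

def softmax(logits):
    """Convert raw network outputs into class probabilities (sums to 1)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

try:
    # Available on the DeepRacer device; guarded so the sketch imports anywhere.
    from openvino.inference_engine import IECore

    def load_and_infer(model_xml, model_bin, image_nchw, device="CPU"):
        """Load an IR model and run one synchronous inference.

        `model_xml` and `model_bin` are hypothetical IR file paths.
        """
        ie = IECore()
        net = ie.read_network(model=model_xml, weights=model_bin)
        exec_net = ie.load_network(network=net, device_name=device)
        input_blob = next(iter(net.input_info))
        results = exec_net.infer(inputs={input_blob: image_nchw})
        # Each output is a vector of raw scores; normalize to probabilities.
        return {name: softmax(list(out.flatten())) for name, out in results.items()}
except ImportError:
    IECore = None  # OpenVINO not installed; only the softmax helper is usable.
```

The real node implements this flow in C++; this sketch only mirrors the shape of the API calls involved.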

Run the node

To launch the built inference_node as the root user on the DeepRacer device, open another terminal on the device and run the following commands:

  1. Switch to root user before you source the ROS2 installation:

     sudo su
    
  2. Source the ROS2 Foxy setup bash script:

     source /opt/ros/foxy/setup.bash 
    
  3. Set the environment variables required to run Intel OpenVino scripts:

     source /opt/intel/openvino_2021/bin/setupvars.sh
    
  4. Source the setup script for the installed packages:

     source ~/deepracer_ws/aws-deepracer-inference-pkg/install/setup.bash  
    
  5. Launch the inference_pkg using the launch script:

     ros2 launch inference_pkg inference_pkg_launch.py
    

Launch Files

An example launch file, inference_pkg_launch.py, is included in this package; it shows how to launch the inference_node independently of the core application.

from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='inference_pkg',
            namespace='inference_pkg',
            executable='inference_node',
            name='inference_node'
        )
    ])

Node Details

inference_node

Subscribed Topics

| Topic Name | Message Type | Description |
| --- | --- | --- |
| /sensor_fusion_pkg/sensor_msg | EvoSensorMsg | Message with the combined sensor data. Contains single-camera or two-camera images and LiDAR distance data. |
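As a rough illustration, the combined sensor message can be modeled as a plain Python container. The field names below (`images`, `lidar_data`) are assumptions for illustration only; the authoritative definition of EvoSensorMsg lives in deepracer_interfaces_pkg:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class EvoSensorMsgSketch:
    """Illustrative stand-in for EvoSensorMsg (field names are assumptions)."""
    images: List[bytes] = field(default_factory=list)      # one or two camera frames
    lidar_data: List[float] = field(default_factory=list)  # LiDAR distances

    def has_stereo(self) -> bool:
        """True when both camera images are present."""
        return len(self.images) == 2
```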

Published Topics

| Topic Name | Message Type | Description |
| --- | --- | --- |
| /inference_pkg/rl_results | InferResultsArray | Message with the reinforcement learning inference results: class probabilities for the state input passed through the model currently selected in the device console. |

Services

| Service Name | Service Type | Description |
| --- | --- | --- |
| load_model | LoadModelSrv | Service responsible for setting the pre-processing algorithm and inference task for the specific type of model loaded. |
| inference_state | InferenceStateSrv | Service responsible for starting and stopping the inference task. |
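Together, the two services amount to a small state machine: load a model (which fixes its pre-processing and inference task), then start or stop inference. A minimal pure-Python sketch of that control flow, with hypothetical method and attribute names not taken from this package:

```python
class InferenceControlSketch:
    """Illustrative control flow behind load_model / inference_state.

    Names are assumptions for illustration, not the node's actual API.
    """
    def __init__(self):
        self.model_path = None
        self.running = False

    def load_model(self, model_path: str) -> bool:
        """Mirror LoadModelSrv: select a model and its pre-processing."""
        if self.running:
            return False  # refuse to swap models while inference is running
        self.model_path = model_path
        return True

    def set_inference_state(self, start: bool) -> bool:
        """Mirror InferenceStateSrv: start or stop the inference task."""
        if start and self.model_path is None:
            return False  # nothing loaded yet
        self.running = start
        return True
```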

Resources
