
Immersive Semi-Autonomous Aerial Command System

Virtual Reality Interface for Multi-UAV Teleoperation, Version 2

Overview Video


ISAACS is an undergraduate-led research group within the Center for Augmented Cognition of the VHL Vive Center for Enhanced Reality at the University of California, Berkeley. Our research is in human-UAV interaction, with a focus on teleoperation, telesensing, and multi-agent interaction. We are also part of the student group Extended Reality @ Berkeley, and collaborate with the Lawrence Berkeley National Laboratory to perform 3D reconstruction of the environment via state-of-the-art methods in radiation detection. Our vision is to create a scalable open source platform for Beyond Line of Sight Flight compatible with any UAV or sensor suite.


Table of Contents

  1. Hardware Dependencies and Setup
  2. Software Dependencies and Setup
  3. Installation and Deployment
  4. Meet the Team
  5. Acknowledgments
  6. Licensing


1. Hardware Dependencies and Setup

You will need the following to run the system in simulation:

  • DJI Matrice 210 Quadrotor
  • DJI Matrice 210 RTK
  • 2x Matrice 210 Quadrotor Batteries
  • 1x Matrice 210 RTK Battery
  • DJI Matrice Manifold 2 Onboard Computer
  • 1x USB 3.0 to TTL Cable
  • 1x USB 3.0 to USB 3.0 Cable
  • Oculus Rift Virtual Reality Headset
  • VR-Ready Computer (we suggest a GeForce GTX 970 Graphics Card or better)
  • An Ethernet Cable

Additionally, to fly the UAV in real space, you will need:

  • DJI Matrice RTK GPS Station
  • 1x USB 3.0 Wi-Fi Card
  • 1x Matrice 210 Quadrotor Battery
  • A Wi-Fi Source


Connect the Manifold's USB 3.0 port to the Matrice 210's UART port using the USB 3.0 to TTL cable. Refer to this page for more information. Unlike what is described in the DJI documentation, we found that on our Matrice 210 the TX and RX pins were swapped: TX is the white pin, RX is the green pin, and GND is the black pin. Also make sure that the gray power slider is slid all the way to the left.

You will moreover need to plug a USB 3.0 Wi-Fi card into the Manifold (if you plan on flying the UAV in real space), or an Ethernet cable (if you only plan on running the system in simulation). To facilitate the next steps, you may also want to connect the Manifold to a keyboard and, via an HDMI cable, a screen. If not, you can always SSH into it.
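If you choose to SSH in, a minimal example follows; the username and IP address are placeholders, so substitute your Manifold's actual credentials (you can find its IP by running `hostname -I` locally on the Manifold):

```shell
# Placeholder values: replace "dji" and 192.168.1.42 with your
# Manifold's actual username and IP address.
ssh dji@192.168.1.42
```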

Once you have done the above, place two batteries in the UAV and plug in the Manifold power cord. Then double-press and hold the white button on the front of the Matrice 210 UAV, and finally press and hold the PWR button on the Manifold. If everything went well, the UAV will play a sound and the Manifold computer will boot.

2. Software Dependencies and Setup

The system uses two computers: one attached to the UAV, which we call the Manifold, and one running the VR interface, which we call the VR-Ready Computer. You may also use a third computer to run a flight simulation with DJI Assistant 2 for Matrice, but this can be done on the VR-Ready Computer while the frontend application is running. The Manifold backend depends on ROS Kinetic, which requires Ubuntu 16.04 (Xenial) or another Debian-based GNU/Linux distribution. You will furthermore need the ROS DJI SDK and a Rosbridge Server. The frontend interface depends on Unity 2017.4.32f1 and can in principle run on any platform, but has only been tested on Windows 10.
Although the Manifold comes with most of what you need preinstalled, you will have to set up a ROS workspace and the Rosbridge Server. Refer to this page for more information on how to set up a ROS workspace.
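For reference, a minimal workspace setup might look like the following sketch; it assumes ROS Kinetic is installed under /opt/ros/kinetic and uses the DJI/catkin_ws path referenced later in this README:

```shell
# Create and build an empty catkin workspace for ROS Kinetic.
source /opt/ros/kinetic/setup.bash
mkdir -p "$HOME/DJI/catkin_ws/src"
cd "$HOME/DJI/catkin_ws"
catkin_make                 # generates build/ and devel/
source devel/setup.bash     # overlay the new workspace on the environment
```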

Common Problems when Setting up a Workspace

`catkin_make` does not compile
You might need to clone the nmea_msgs package into the src folder, and then try again.
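A sketch of that fix, assuming the workspace lives at $HOME/DJI/catkin_ws and that nmea_msgs is fetched from its usual ros-drivers home on GitHub (verify the URL against current upstream):

```shell
# Clone nmea_msgs into the workspace's src folder and rebuild.
cd "$HOME/DJI/catkin_ws/src"
git clone https://github.com/ros-drivers/nmea_msgs.git
cd ..
catkin_make
```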

I'm editing the sdk.launch file with `rosed`, but I cannot find the correct serial port
In most cases this will be /dev/ttyUSB0. If it is incorrect, an error will pop up when launching. To find the correct serial port:

  • $ grep -iP PRODUCT= /sys/bus/usb-serial/devices/ttyUSB0/../uevent CAUTION: there is a space between PRODUCT= and /sys. This is not a typo.
  • $ lsusb | grep <ID> Replace <ID> with the ID found from the previous step.
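The two steps above can also be combined into a single loop that prints the product ID behind every ttyUSB device, which you can then match against `lsusb` output (a sketch; it assumes at least one USB-serial device is attached):

```shell
# Print the USB PRODUCT string for each attached ttyUSB device.
for dev in /sys/bus/usb-serial/devices/ttyUSB*; do
  echo "$(basename "$dev"): $(grep -i PRODUCT= "$dev/../uevent")"
done
```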

I don't know what to set the baudrate to
The baudrate should be set to 921600. If you are using DJI Assistant 2 for Matrice to simulate a flight, you also need to set the same baudrate inside the DJI Assistant 2 for Matrice app, under the SDK tab.
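To inspect or edit the relevant values without hunting for the file, something like the following should work; the parameter names serial_name and baud_rate follow Onboard-SDK-ROS conventions, so verify them against your checkout:

```shell
# Open sdk.launch in $EDITOR:
rosed dji_sdk sdk.launch
# Or just inspect the serial port and baudrate settings:
grep -E 'serial_name|baud_rate' "$(rospack find dji_sdk)/launch/sdk.launch"
```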

Connecting to the simulator and launching the SDK fails for an unknown reason
This can be due to many reasons, but generally it means that you have to set a udev exception, and/or disable advanced sensing and connect the Manifold to the UAV with an additional USB 3.0 to USB 3.0 cable. CAUTION: disabling advanced sensing disables the Matrice 210's built-in obstacle avoidance mechanism.

  • $ echo 'SUBSYSTEM=="usb", ATTRS{idVendor}=="2ca3", MODE="0666"' | sudo tee /etc/udev/rules.d/m210.rules
  • Change enable_advanced_sensing to false in the file DJI/catkin_ws/Onboard-SDK-ROS/dji_sdk/src/modules/dji_sdk_node.cpp
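After adding the udev rule, you can make it take effect without rebooting, and rebuild the workspace if you edited dji_sdk_node.cpp (a sketch, assuming the workspace path used elsewhere in this README):

```shell
# Reload udev rules so the new m210.rules entry applies immediately.
sudo udevadm control --reload-rules
sudo udevadm trigger
# Rebuild after changing enable_advanced_sensing in dji_sdk_node.cpp.
cd "$HOME/DJI/catkin_ws" && catkin_make
```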

Installing the Rosbridge Server on the Manifold

$ sudo apt-get install ros-kinetic-rosbridge-server

Installing Unity on the VR-Ready Computer

Unity versions and installation instructions can be found on this page.

3. Installation and Deployment

Make sure that you have read and completed the Hardware Dependencies and Software Dependencies sections before proceeding with the system installation. This is critically important; the system will not work otherwise.

Installation (Simulation only)

  1. Clone the project on the VR-Ready Computer with the following command:
    $ git clone https://github.com/immersive-command-system/ImmersiveDroneInterface_2.git
  2. Initialize submodules:
    $ git submodule update --init --recursive
  3. Place the RTK Battery inside the RTK Controller, and turn it on.
  4. Disable RTK Signal (you may need to connect the controller to a phone or tablet with the 'DJI Go 4' app for this step)
  5. Modify the Manifold's .bashrc to source ROS environment variables:
    $ echo 'cd $HOME/DJI/catkin_ws && source devel/setup.bash' >> $HOME/.bashrc
  6. In a new terminal, start the DJI SDK:
    $ roslaunch dji_sdk sdk.launch
  7. Test if the UAV can receive Manifold instructions by running the following command (this should spin the rotors, without actually flying the drone):
    $ rosservice call /dji_sdk/sdk_control_authority 1
    $ rosservice call /dji_sdk/drone_arm_control 1
  8. If the rotors spin, great, we are almost there! Stop them with the following command:
    $ rosservice call /dji_sdk/drone_arm_control 0
  9. Check that the Manifold is correctly connected to the Ethernet cable. Connect the other end of the Ethernet cable to the VR-Ready computer.
  10. Run the Rosbridge Server. This will open a WebSocket on port 9090. If you want to use a different port, see this page.
    $ roslaunch rosbridge_server rosbridge_websocket.launch
  11. Connect the Oculus headset with the VR-Ready laptop. If you have not done so already, follow through the Oculus Rift setup.
  12. Connect the Manifold to a computer with the DJI Assistant 2 for Matrice using a USB 3.0 to USB 3.0 cable, and launch the Flight Simulator.
  13. Launch our application via Unity. Find the script named ROSDroneConnection.cs (NOTE: this script is deprecated; use the Unity GUI instead) and replace the IP address of the server with the actual IP address of the Manifold. To find the IP address of the Manifold, use the following command:
    $ hostname -I
  14. Save and close the script, then launch our application by clicking the play (small triangle) button inside Unity. If all went well, the terminal from which the Rosbridge Server was launched will print that a client has connected.
  15. Congratulations, you are ready to fly your UAV in VR!
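If the client never shows up in the Rosbridge terminal during step 14, two quick checks on the Manifold can narrow the problem down (a sketch; tool availability may vary between systems):

```shell
# Is the Rosbridge WebSocket actually listening on port 9090?
ss -tln | grep 9090
# Are the DJI SDK topics up?
rostopic list | grep dji_sdk
```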

Installation (with UAV flight)

Follow steps 1-11 as above and skip step 12. Then, set up the RTK GPS Station. Finally, continue with steps 13-15.

Deployment

Each time you want to run our system, you will first have to disable the RTK signal, and then run the DJI SDK and the Rosbridge Server. The routine is rather simple:

  1. Power on the UAV, the Manifold, and the VR-Ready Computer
  2. (Optionally) connect the UAV to the DJI Assistant 2 for Matrice, and launch the Flight Simulator
  3. Turn off the RTK signal through the 'DJI Go 4' app
  4. Launch the SDK
    $ roslaunch dji_sdk sdk.launch
  5. Launch the Rosbridge Server
    $ roslaunch rosbridge_server rosbridge_websocket.launch
  6. Open our system in Unity and click the play button
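Steps 4 and 5 of this routine can be wrapped in a small convenience script. The script below is a hypothetical sketch (including its name), assuming the ROS workspace is already sourced via .bashrc as described in the installation section:

```shell
#!/bin/bash
# start_backend.sh (hypothetical name): launch the DJI SDK and the
# Rosbridge Server together, and keep both in the foreground.
roslaunch dji_sdk sdk.launch &
sleep 5   # give the SDK a moment to connect to the UAV
roslaunch rosbridge_server rosbridge_websocket.launch &
wait
```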

Moreover, each time your network connection changes, you will have to update the IP address that the Unity client subscribes to.

4. Meet the Team

Current

Eric Wang
Jasmine Bae
Varun Saran
Nitzan Orr
Shreyas Krishnaswamy

Alumni

Jesse Patterson
Jessica Lee
Peru Dayani
Apollo
Ji Han
Xin Chen
Paxtan Laker
Rishi Upadhyay
Brian Wu
Eric Zhang
Newman Hu

5. Acknowledgments

We would like to thank Dr. Allen Yang and Dr. Kai Vetter for their mentorship and supervision. We would also like to thank our graduate advisors, David McPherson and Joe Menke for their continuous support.

6. Licensing

This repository is distributed under the Apache 2.0 license. All media files are distributed under the Creative Commons Attribution-ShareAlike 4.0 International license.

In case of doubt about whether you can use an asset, or how to correctly attribute its authors, please e-mail us at [email protected].


Open Issues

Implement two-POV system (3rd person, 1st person)

  • Toggle POVs by clicking the left thumbstick
  • 3rd person view: teleport is disabled, avatar automatically placed in the front middle of the base ("table"). Replace terrain with virtual background. Left thumbstick changes map placement.
  • 1st person view: teleport is enabled. Disable virtual background, and fully expand the map. Left thumbstick changes camera angle. Left index button automatically teleports to the position (and therefore view) of the currently selected drone.

Faces in Mesh Threshold

We may want to investigate the minimum-face-count threshold, since blocks are sometimes cleared. The threshold should probably apply only when creating a new game object; updates to existing objects should be exempt.

Place google map into Unity

The RFS Google Maps building model is in Blender. Move it to Unity and replace Mapbox, as a new branch.

  • Find correct export format
  • Import into Unity
  • Edit scale
  • Hide mapbox

Looking into the future: API Architecture

As we move away from Matrice drones, we will have to make our control platform more modular. This entails only sending generic messages from Unity, and doing the decoding on the onboard computer (or the microcontroller).

This is more in line with ISAACS version 1. The steps needed are the following:

  1. Allow the user to input a drone type (string) through the Unity interface, and select its properties through a menu similar to the one we currently have for drone instances.
  2. Generate a new class whose name is the drone type, and whose properties are the ones selected by the API user.
  3. Implement a network architecture that exchanges messages between Unity and manifolds (see ISAACS v1 for details), and remove our M100/M210/M600 scripts.
  4. Implement a wrapper that receives these messages, and converts them to ROS commands. It's up to the API user to specify these commands for their drone model (although we can have some default "profiles", such as for the M600).

LiDAR Mesh Efficiency

Speed up the LiDAR mesh rendering

  • Create separate thread for LiDAR mesh
  • Time the average mesh rendering time
  • Compare mesh rendering time to message rate
  • Double Mesh block size
