buzzmobile

An autonomous parade vehicle, modeled after Georgia Tech's Rambling Wreck

Architecture

A list of available nodes and an overview of the architecture are available here.

Environment

To get started, you should be running Ubuntu 14.04 (required for ROS Indigo) with Python 2.7 installed. Then run:

mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src
git clone https://github.com/gtagency/buzzmobile.git
cd buzzmobile
./install

If you already have ROS Indigo installed, you can just clone directly into ~/catkin_ws/src/ and run the install script. If you want to manually install ROS Indigo, you can follow the tutorial here.

The install script will create a virtualenv, install system dependencies (including ROS Indigo), install python dependencies, build the package, and source all required files. It will also put the rosinit, rosdevel and rosvenv aliases in your .bashrc.

To use the google maps api, you'll need two api keys. Put one under buzzmobile/sense/maps_querier/googlemapskey.py and one under buzzmobile/tools/route_mapper/googlemapskey.py as shown below. Note that the keys need to have proper permissions set in the Google API Console, for use of the Google Maps API and the Google Maps Static API, respectively.

googlemapskey='your_secret_api_key'

To use the GPS and lidar nodes, you will need user permissions to directly access the USB ports for the GPS and lidar. For that, run:

sudo usermod -aG dialout <YOUR USERNAME>

You will then need to log in and out again. Simply starting a new terminal is not sufficient. The Linux kernel will not refresh groups until the user completely logs out and logs in again.
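After logging back in, you can verify that the group change took effect. A quick check using only Python's standard library (a sketch; `in_group` is a helper name chosen here, and note that `gr_mem` only lists supplementary group members, which is what `usermod -aG` adds):

```python
import getpass
import grp


def in_group(group_name, user=None):
    """Return True if the given (or current) user is a supplementary
    member of group_name."""
    user = user or getpass.getuser()
    try:
        group = grp.getgrnam(group_name)
    except KeyError:  # group does not exist on this system
        return False
    return user in group.gr_mem
```

If `in_group("dialout")` is still `False` after running `usermod`, you most likely have not fully logged out and back in yet.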

Running

Make sure you are running inside the virtualenv (run rosvenv), or things will appear broken.

To start running, you must first run catkin_make from the ~/catkin_ws directory. If the build succeeds, you can run roscore to start the main ROS process.

Open a new terminal tab for each node and run them like so:

rosrun buzzmobile image_const.py
rosrun buzzmobile edge_detector
rosparam set usb_cam/pixel_format yuyv
rosrun usb_cam usb_cam_node

Note that rospy nodes don't require catkin_make to run, but do require the .py extension. If you're writing a new rospy node, also make sure the file is made executable, and has #!/usr/bin/env python as its first line.

chmod +x path/to/rospy_node.py  # make sure file has shebang
rosrun buzzmobile rospy_node.py
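Both requirements (executable bit and shebang) can be sanity-checked programmatically. A small sketch, using only the standard library (`is_valid_rospy_node` is a helper name invented here, not part of the repo):

```python
import os
import stat


def is_valid_rospy_node(path):
    """Check that a script has its executable bit set and starts with
    the python shebang that rosrun expects."""
    mode = os.stat(path).st_mode
    executable = bool(mode & stat.S_IXUSR)
    with open(path) as f:
        shebang = f.readline().rstrip("\n")
    return executable and shebang == "#!/usr/bin/env python"
```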

Some nodes require parameters that are defined in the buzzmobile/constants.yaml file. To load those constants as rosparams, do:

rosparam load ~/catkin_ws/src/buzzmobile/buzzmobile/constants.yaml

If you want to visualize your nodes, you can run the ROS visualizer (rviz), image_view, or rqt_gui. These three have different ways of visualizing messages being published:

rosrun rviz rviz
rosrun image_view image_view image:=some_imgmsg
rosrun rqt_gui rqt_gui

To load the buzzmobile mission control, load rqt_gui with the mission control perspective (configuration) file:

rosrun rqt_gui rqt_gui --perspective-file=buzzmobile/tools/mission_control/Default.perspective

To run the GPS node, do:

rosrun nmea_navsat_driver nmea_serial_driver _port:=/dev/ttyUSB0 _baud:=4800

To run the Lidar node, do:

rosrun hokuyo_node hokuyo_node port:=/dev/ttyACM0

Note that /dev/ttyUSB0 and /dev/ttyACM0 are the default serial ports for the GPS and lidar respectively, but they may differ on your machine. Here are some useful commands for debugging if things aren't set up correctly:

ls -l /dev/ttyACM0  # List permissions. Fails if /dev/ttyACM0 does not exist.
sudo chmod a+rw /dev/ttyACM0  # Set read/write permissions for all users (not recommended).

If you need to mock a polyline, this tool will be useful: google polyline util

Recording

If you want to record the messages being published by certain nodes, you can use rosbag:

mkdir ~/bagfiles
cd ~/bagfiles
rosbag record -O filename /message/name

To see info about the recorded data, run rosbag info filename.bag.

To play the data back (and publish those messages), run rosbag play filename.bag. To play in a loop, just add the -l flag.

Developing

If you ever need to add ROS dependencies, add them to buzzmobile/package.xml and install them with:

cd ~/catkin_ws/src
rosdep install -y --from-paths ./buzzmobile/buzzmobile --ignore-src --rosdistro=indigo

If you need to add python deps, make sure you're in the virtual environment (rosvenv), then add the dep to buzzmobile/setup.py and do:

cd ~/catkin_ws/src/buzzmobile
pip install -e buzzmobile

Alternatively, you can update both ros and python deps using:

./ci_scripts/update_deps

Starting Car

To start the car and prepare it for driving, perform the following steps:

  1. Connect the battery.
  2. Flip the switch inside to turn the car on.
  3. Ensure all e-stops are disabled (red buttons on the front and back in the out position, and enabled via the remote).
  4. Press the green button to start the motors (the car is now live).

The car starts in START mode where it receives no information. It must be switched to MANUAL or AUTO for it to drive. See 'Manual Mode Controls' for details on switching modes and operating the car.

Manual Mode Controls

The controller_node node outputs a CarPose message and a CarState message determined by input from a PS4 controller. To control these messages and operate the car manually using the controller, use the following controls:

  • Left Joystick: Change steering angle
  • R2 (Right Trigger): Change velocity
  • Square: Enable reverse. While this is held down, velocity is negated, meaning the car will accelerate backwards.
  • X: Honk the horn
  • Home Button: Switch the car between AUTO and MANUAL modes.
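The control mapping above can be sketched as a pure function (a sketch only: the CarPose field names, input ranges, and the max_steering_angle/max_speed parameters here are assumptions, not the node's actual API):

```python
def car_pose(left_x, r2, square_held, max_steering_angle=0.5, max_speed=2.0):
    """Map controller input to a pose command.

    left_x: left joystick horizontal axis in [-1, 1]
    r2: right trigger in [0, 1]
    square_held: True while Square is held (reverse)
    """
    velocity = r2 * max_speed
    if square_held:
        velocity = -velocity  # holding Square negates velocity
    return {"steering_angle": left_x * max_steering_angle,
            "velocity": velocity}
```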

The car starts in START mode; the first press of the Home button switches it to MANUAL mode, and every subsequent press toggles between AUTO and MANUAL.
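The mode transitions can be sketched as a tiny state machine (mode names are taken from the text above; the real implementation may differ):

```python
START, MANUAL, AUTO = "START", "MANUAL", "AUTO"


def next_mode(current):
    """Mode after a Home-button press: START goes to MANUAL,
    then every press toggles between MANUAL and AUTO."""
    if current == START:
        return MANUAL
    return MANUAL if current == AUTO else AUTO
```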

Testing

Testing is done with pytest. To run tests, you can run the script in ci_scripts/unittest, which will run all unit tests. If you want to run a specific test, make sure that the environment is initialized (run rosvenv, rosinit, and rosdevel) then run:

cd ~/catkin_ws/src/buzzmobile/buzzmobile
pytest tests/unit/path/to/test.py

Be careful! Make sure not to run any tests from outside the inner buzzmobile/ dir, as that creates a name conflict in tests that try importing buzzmobile.std_msgs.

Also make sure not to run tests from the root directory of the project, as there is a virtualenv there: running pytest will attempt to run thousands of unittests included with the python interpreter.

To write additional unit tests, place them within buzzmobile/tests/unit in directories that mirror the source files; that is, tests for the buzzmobile/process/gps_mapper node should go in buzzmobile/tests/unit/process/test_gps_mapper.py. Integration and simulation tests should go in the buzzmobile/tests/integration and buzzmobile/tests/simulation subdirectories respectively.

In order to use the test_util api for writing tests, please refer to the readme in buzzmobile/buzzmobile/tests/test_utils/.

buzzmobile's People

Contributors

chsahit, coletaylor788, dacohen, ebarnette, ebretl, irapha, joshuamorton, kpberry, ruyimarone

buzzmobile's Issues

Make deployment similar on all platforms

Following some discussion in #61, it might be better if we factored out parts of the Travis CI build script into an install.sh, so that

git clone
./install.sh

would install system dependencies, run setup.py, install rosdeps, etc. and allow deployment to be done the same way across all systems.

Rossify gps_mapper

Takes polyline and saves it, outputs a gps frame at every NavFix received

Prune repo of unused things

Such as two of the image_const nodes (we only need one, and it should arguably live in tools).
Or even the lane_detector, which we are scrapping in favor of the future road_detector node, aka agencynet.

Properly fix battery

It disconnected again today, and I'm worried I might not be connecting it properly, so I'd like to have someone else fix it.

Node namings

sense/inputer/repl.py -> sense/inputer/inputer.py
lidar_node should be lidar_to_frame

Fix Hardware eStop

The Arduino should know if the car is e-Stopped. Right now something is disconnected. Debug in conjunction with RJ if necessary.

Fix if statement for estop in Arduino code

if(digitalRead(estop_pin) == HIGH) is never evaluating to true, even when the e-stops are disabled. Since this never evaluates to true, the car never moves. Commenting out this if statement allows the car to run, but it was likely important and should be fixed. Here is the comment from the code:

/* Only run the controllers if the motors are enabled.
 * This prevents the controllers ramping up while the
 * vehicle is estopped, creating a dangerous situation.
 */

Add 1-way coupling on rear-axle

The chain makes clicking noises occasionally while driving straight and clicks very quickly when turning sharply. It doesn't really sound right.

Create route_mapper

Should take map_querier's polyline and the current NavFix and output a route_map image gotten from google maps' static api (example code in buzzmobile/tools/mapper/mapper.py)

Also note: only query every 8 seconds, and every time a new polyline is published, so we don't run into our daily limit.
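The "only query every 8 seconds" constraint could be enforced with a small throttle helper, sketched here (the name make_throttle and this shape are assumptions, not existing repo code):

```python
import time


def make_throttle(min_interval):
    """Return a ready() callable that answers True at most once per
    min_interval seconds; call it before issuing each API query."""
    last = [float("-inf")]  # timestamp of the last allowed query

    def ready(now=None):
        now = time.time() if now is None else now
        if now - last[0] >= min_interval:
            last[0] = now
            return True
        return False

    return ready
```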

Consider removing horn_node

Currently, we have no horn. But with the assumption that we will get one (this is pending discussion):

Shouldn't car_interface also process CarPose to handle turning the horn on and off? Why do we need a separate node and a separate message simply to say "horn is on/off", when the CarPose will already encode this information? (And if it doesn't currently encode it, it should, shouldn't it?)

Get Google Maps key Working

Add google maps key and keep it private so other people don't use it.

Also it currently exists in the git history, so we may want to go ahead and expunge it from history as well.

Move nodes to different folders

buzzmobile/mission_control/route_mapper.py -> buzzmobile/route_mapper/route_mapper.py

@coletaylor788 I want your opinion on this:
buzzmobile/sense/controller.py -> buzzmobile/plan/controller.py (or process/)
Mainly because controller outputs a CarPose, so it does the planning

steering and lidar_to_frame OOM

If you run image_const gps_model, a lidar bag, lidar_to_frame, frame_merger, and steering, some of them get killed because they go OOM sometimes. We need to investigate why.

Formalize hardware features of car

This includes...

  • measuring wheel_base
  • re-measuring wheel_circumference (the rosparam we have was taken from buzzmobile-old)
  • re-measuring max_steering_angle
  • considering changes to the way we compute travel_distance

About the last point, right now, travel_distance is a value that is a "good enough" approximation of the distance travelled per tick. In buzzmobile-old, this was computed by the odometry interface, which assumed that the car was moving at a constant speed, and published:

distance_travelled = (ticks_elapsed_since_last_keep_alive * wheel_circumference) / ticks_per_revolution

We don't have the odometry interface anymore, and we don't need it, because we're not simulating the car in the world model to make decisions about future states. So, to us, travel_distance can be an approximation, because all it does is codify the granularity of the tentacle points. But I wanted to make sure this was mentioned somewhere and acknowledged.

https://github.com/gtagency/buzzmobile/pull/32/files#diff-803bc1ddda718a341d80de27f0125f74R14
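The formula above, written out as a checked sketch:

```python
def distance_travelled(ticks, wheel_circumference, ticks_per_revolution):
    """Approximate distance covered since the last keep-alive:
    (ticks * circumference) / ticks_per_revolution."""
    return (ticks * wheel_circumference) / float(ticks_per_revolution)
```

For example, 10 ticks on a 2.0 m wheel with 20 ticks per revolution gives half a revolution, i.e. 1.0 m.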

Use dynamic breaking_distance

steering.py should not be using a rosparam breaking_distance; it should calculate a braking distance based on the current speed and other features of, say, the motor.

This will probably be something to figure out when we have a real-world odometry interface. See #41
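One first-order model to start from (an assumption for illustration, not the repo's chosen method): under constant deceleration a, stopping distance is v² / (2a).

```python
def braking_distance(speed, deceleration):
    """Stopping distance under constant deceleration: v^2 / (2a).

    speed: current speed in m/s
    deceleration: assumed constant braking deceleration in m/s^2
    """
    if deceleration <= 0:
        raise ValueError("deceleration must be positive")
    return speed * speed / (2.0 * deceleration)
```

A real implementation would also need to fold in motor/brake response time and road conditions.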
