This project was the culmination of nine months of accelerated learning in the self-driving space. The goal was to implement and integrate everything we learned over the last three terms and test it in a real-world scenario. The skeleton software stack for the self-driving car was provided in ROS with Autoware and Dataspeed DBW integrations. We had to implement a planning module (waypoint updater), a control module (twist_control), and a perception module (traffic light detection).
Here we obtain the pose of the car with respect to the waypoints from the simulator or the test track, and find the waypoint closest to the car. The next 200 waypoints along the track are then published to the controller node. If a red traffic light is detected, the waypoints up to the light are given decelerating velocities, and the waypoints past the light are given a velocity of 0. This node also decides whether it is safe to stop, given the distance to the traffic light and the current velocity of the car.
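The deceleration logic above can be sketched as follows. This is a simplified illustration, not the project's exact node: `decelerating_velocities` and the `(x, y, v)` tuple format are hypothetical stand-ins for the `styx_msgs/Waypoint` messages the real waypoint updater works with.

```python
import math

def decelerating_velocities(waypoints, stop_idx, max_decel=1.0):
    """Assign decreasing target speeds so the car stops at stop_idx.

    waypoints: list of (x, y, v) tuples -- a simplified stand-in for
    the styx_msgs/Waypoint messages used in the actual node.
    """
    out = []
    for i, (x, y, v) in enumerate(waypoints):
        if i >= stop_idx:
            # at and beyond the stop line: full stop
            out.append((x, y, 0.0))
            continue
        # straight-line distance from this waypoint to the stop waypoint
        dist = math.hypot(waypoints[stop_idx][0] - x,
                          waypoints[stop_idx][1] - y)
        # v = sqrt(2 * a * d) gives a constant-deceleration speed profile
        out.append((x, y, min(v, math.sqrt(2.0 * max_decel * dist))))
    return out
```

The `v = sqrt(2·a·d)` profile falls out of constant-deceleration kinematics, so speeds taper smoothly to zero at the stop line instead of braking abruptly.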
The waypoints are loaded into the controller module, which issues twist commands via the twist controller. The twist controller uses a PID controller for the throttle/brake action based on the waypoint list, while steering is handled by a yaw controller based on the target angular and linear velocities. The brake, throttle, and steering commands are then published to the DBW node to control the car.
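A minimal sketch of the throttle/brake PID loop described above; the class shape and the gains used in the usage note are illustrative, not the project's tuned values.

```python
class PID:
    """Minimal PID controller for the throttle/brake action.

    Output is clamped to [mn, mx], e.g. [0, 1] for throttle.
    """
    def __init__(self, kp, ki, kd, mn=0.0, mx=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.mn, self.mx = mn, mx
        self.int_val = 0.0      # accumulated integral term
        self.last_error = 0.0   # previous error, for the derivative

    def step(self, error, dt):
        # error: target velocity minus current velocity
        self.int_val += error * dt
        derivative = (error - self.last_error) / dt
        self.last_error = error
        val = (self.kp * error
               + self.ki * self.int_val
               + self.kd * derivative)
        return max(self.mn, min(self.mx, val))
```

Called once per control cycle (e.g. at 50 Hz, so `dt = 0.02`) with the velocity error, it returns a clamped throttle command; a negative unclamped value would instead be mapped to brake torque.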
This node publishes the waypoint of the closest red traffic light whenever one is detected. Traffic light detection uses an optimised detector-classifier network. For the detector we use the SSD-MobileNet architecture pretrained on the COCO dataset (traffic light is one of its class labels), followed by SqueezeNet, a low-compute, low-storage counterpart to AlexNet, for classification. The single-shot detector with MobileNet was chosen for its accuracy and performance benefits, as it is designed to run on low-power client hardware. SqueezeNet was selected because it dramatically reduces parameter storage, so all the weights fit in RAM simultaneously. It was trained on traffic light images from COCO and a few captured from the simulator.
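The two-stage flow can be sketched as below. `detect_fn` and `classify_fn` are hypothetical stand-ins for the SSD-MobileNet and SqueezeNet inference calls; the image is modelled as a plain list of pixel rows to keep the sketch self-contained.

```python
def detect_traffic_lights(image, detect_fn, classify_fn,
                          tl_class_id=10, min_score=0.5):
    """Detector-classifier pipeline sketch.

    detect_fn(image) -> (boxes, classes, scores), SSD-style output;
    classify_fn(crop) -> 'red' / 'yellow' / 'green'.
    COCO category id 10 is 'traffic light'.
    """
    lights = []
    for box, cls, score in zip(*detect_fn(image)):
        if cls == tl_class_id and score >= min_score:
            ymin, xmin, ymax, xmax = box
            # crop the detected region (image is a list of pixel rows)
            crop = [row[xmin:xmax] for row in image[ymin:ymax]]
            lights.append((box, classify_fn(crop)))
    return lights
```

Keeping detection and colour classification separate lets the heavier detector run on full frames while the small classifier only sees the cropped light regions.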
- Vinay Kashyap (Team lead) - [email protected]
- Tom Zheng - [email protected]
- Anh Le - [email protected]
- Daedeepya Yendluri - [email protected]
- Marcos Ahuizotl Fragoso Iñiguez - [email protected]
This is the project repo for the final project of the Udacity Self-Driving Car Nanodegree: Programming a Real Self-Driving Car. For more information about the project, see the project introduction here.
Please use one of the two installation options, either native or docker installation.
- Be sure that your workstation is running Ubuntu 16.04 Xenial Xerus or Ubuntu 14.04 Trusty Tahr. Ubuntu downloads can be found here.
- If using a Virtual Machine to install Ubuntu, use the following configuration as minimum:
- 2 CPU
- 2 GB system memory
- 25 GB of free hard drive space
The Udacity provided virtual machine has ROS and Dataspeed DBW already installed, so you can skip the next two steps if you are using this.
- Follow these instructions to install ROS
- ROS Kinetic if you have Ubuntu 16.04.
- ROS Indigo if you have Ubuntu 14.04.
- Install Dataspeed DBW
- Use this option to install the SDK on a workstation that already has ROS installed: One Line SDK Install (binary)
- Download the Udacity Simulator.
Build the docker container
docker build . -t capstone
Run the docker file
docker run -p 4567:4567 -v $PWD:/capstone -v /tmp/log:/root/.ros/ --rm -it capstone
To set up port forwarding, please refer to the instructions from term 2
- Clone the project repository
git clone https://github.com/udacity/CarND-Capstone.git
- Install python dependencies
cd CarND-Capstone
pip install -r requirements.txt
- Make and run styx
cd ros
catkin_make
source devel/setup.sh
roslaunch launch/styx.launch
- Run the simulator
- Download the training bag that was recorded on the Udacity self-driving car.
- Unzip the file
unzip traffic_light_bag_file.zip
- Play the bag file
rosbag play -l traffic_light_bag_file/traffic_light_training.bag
- Launch your project in site mode
cd CarND-Capstone/ros
roslaunch launch/site.launch
- Confirm that traffic light detection works on real life images