iscumd / mammoth
Home Page: http://www.iscumd.com/
License: MIT License
The LiDAR picks up airborne snow as objects, so the costmap is populated with a lot of noise. Some suggested solutions may be:
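One option would be filtering isolated returns out of the point cloud before it reaches the costmap, since snowflakes tend to show up as sparse, isolated points while real obstacles form dense clusters. A minimal sketch (hypothetical, not code from this repo; for real clouds you'd want a KD-tree instead of brute-force distances):

```python
import numpy as np

def radius_outlier_filter(points: np.ndarray, radius: float = 0.3,
                          min_neighbors: int = 3) -> np.ndarray:
    """Drop points with fewer than `min_neighbors` other points within
    `radius`. Isolated returns (likely snowflakes) get removed; dense
    clusters (likely real obstacles) survive.

    points: (N, 3) array of XYZ lidar returns.
    """
    # Brute-force pairwise distances; fine for a sketch, slow for real clouds.
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    # Count neighbors within radius, excluding the point itself.
    neighbor_counts = (dists < radius).sum(axis=1) - 1
    return points[neighbor_counts >= min_neighbors]
```

PCL and pcl_ros ship a comparable RadiusOutlierRemoval filter that could be dropped into the pipeline instead of rolling our own.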
This may be a MoveBase feature: consider replacing MoveBase with a different motion planner. There are a couple of issues we face with the configuration of MoveBase, and some incompatibilities with gmapping and the NavStack. In previous years we used a Pure Pursuit path tracker; the package is at https://github.com/iscumd/path_tracking.
Some of the behaviors that need tuning are:
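For reference, the core of a Pure Pursuit tracker is small. A minimal sketch of the steering law (not the iscumd/path_tracking implementation, just the standard geometry):

```python
def pure_pursuit(goal_x: float, goal_y: float, linear_vel: float) -> float:
    """Return the angular velocity that arcs the robot through a lookahead
    point (goal_x, goal_y) given in the robot frame (x forward, y left).

    Pure pursuit fits a circular arc through the robot and the point:
        curvature = 2 * y / L^2, where L = sqrt(x^2 + y^2)
    and angular velocity = linear velocity * curvature.
    """
    lookahead_sq = goal_x ** 2 + goal_y ** 2
    curvature = 2.0 * goal_y / lookahead_sq
    return linear_vel * curvature
```

The main tuning knob is the lookahead distance used to pick the goal point along the path: short lookaheads track tightly but oscillate, long ones cut corners.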
Line 18 in e5326b2 needs a version: `ros2`, else it breaks the build.
So, great news: I didn't fix the URDF issue. Well, in sim I did. But on the physical robot, forwards and backwards are fine while left and right are switched. As a band-aid fix, I stopped rotating the laser_link joint in the URDF file. I could stop here, but this isn't the recommended setup for the Ouster OS1.
After comp, I need to fix the root problem by orienting every single joint on the robot in the same direction, meaning base_link, base_footprint, the wheel links, laser_link, and imu_link all have their x components parallel to each other.
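In URDF terms, that means every fixed joint gets an identity rotation so the child frame's x axis stays parallel to base_link's. A hypothetical fragment (the joint name and xyz offset here are illustrative, not taken from the actual mammoth URDF):

```xml
<!-- Hypothetical: laser_link mounted with rpy="0 0 0" relative to
     base_link, so the x axes of both frames stay parallel. -->
<joint name="laser_joint" type="fixed">
  <parent link="base_link"/>
  <child link="laser_link"/>
  <origin xyz="0 0 0.5" rpy="0 0 0"/>
</joint>
```

Any frame that genuinely needs a rotated mount (per the sensor's recommended setup) should then get that rotation handled consistently in one place rather than patched per-joint.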
In Foxy, the Ouster driver could be added to the Nav2 lifecycle manager without issue. But after migrating to Galactic and above, Nav2 introduced a bond server: nodes in the lifecycle must connect to it and send a heartbeat within a time period before it transitions to bringing up the next node. The problem is that ros2_ouster does not implement a bond server, causing Nav2 to tear down all the nodes managed by the lifecycle manager.
I have tried turning off the bond server, which lets Nav2 bring up all nodes, but this causes the Ouster driver to not work properly.
I have also tried raising the timeout to about 20 seconds; this works at first, but after 20 seconds Nav2 tears down the entire navigation system.
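For reference, Nav2's lifecycle manager exposes this timeout as a parameter. A sketch of the two workarounds tried above (the `bond_timeout` parameter is from Nav2; the node name and values here are illustrative):

```yaml
lifecycle_manager_navigation:
  ros__parameters:
    # bond_timeout is in seconds. Raising it only delays the teardown;
    # setting it to 0.0 disables bond heartbeat checking entirely.
    bond_timeout: 20.0
```

The proper fix is presumably getting bond support into the Ouster driver itself, since both parameter-level workarounds failed as described above.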
Triple I world for the 2021 virtual competition, with a new snow generation tool to go along with it.
Launch Yeti in simulation or in real life and attempt to send goals to her.
Expected: Yeti drives to the received goal.
Actual: if the goal is in front of her, she drives backwards, and vice versa.
On closer inspection, base_link, base_footprint, laser_link, and imu_link are in the right orientation, but the wheel links are reversed in the URDF model.
In the mammoth_gazebo launch file, we're hardcoding the path for Gazebo's clock remapping based on the name of the world as defined in the SDF file:
'/world/test/clock@rosgraph_msgs/msg/Clock[ignition.msgs.Clock'
and
('/world/test/clock', '/clock'),
If the world name changes, for example by switching to world files with different header info, the clock won't be set properly and you'll get a bunch of weird issues such as joint states not publishing and point cloud points persisting in RViz. This should probably be set to something like "mammoth_world", and all worlds created should use that name in their header information.
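At minimum, the two strings should be derived from a single variable so there is only one place to keep in sync with the SDF. A sketch, assuming the "mammoth_world" convention suggested above:

```python
# Sketch: build the Gazebo clock bridge topic and the ROS remapping from
# one world-name constant instead of hardcoding "test" in two places.
# "mammoth_world" is the convention proposed above, not an existing name.
WORLD_NAME = "mammoth_world"

clock_bridge_arg = (
    f"/world/{WORLD_NAME}/clock@rosgraph_msgs/msg/Clock[ignition.msgs.Clock"
)
clock_remapping = (f"/world/{WORLD_NAME}/clock", "/clock")
```

In the actual launch file these would feed the ros_gz bridge arguments and the node remappings respectively.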
Launch mammoth_snowplow on Yeti.
Expected: the lidar sends UDP packets to the router, the router forwards them to the Intel NUC, and the transforms produce no errors.
Actual: the UDP packets never reach the computer, and when they do, the Ouster driver complains that the parent transform cannot have the same name as its child.
During testing, the private IP addresses currently set in the file won't work, so they had to be changed to the 192.168.x.x range.
The other thing is that the frames set in the file upset TF, so sensor_frame is kept as laser_link, and laser_frame can be changed to anything; I chose os1_link.
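Putting those fixes together, the driver config might look like this (the keys are the names referenced in this issue; check the ros2_ouster docs for the exact parameter names, and the IPs here are illustrative):

```yaml
ouster_driver:
  ros__parameters:
    lidar_ip: 192.168.1.10    # illustrative; must be in the 192.168.x.x range
    computer_ip: 192.168.1.5  # illustrative
    sensor_frame: laser_link  # keep matched to the URDF frame, or TF complains
    laser_frame: os1_link     # free choice; os1_link used here
```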
The Roboteq controller is supposed to have no USB port passed in from ROS parameters, but it's being passed /dev/ttyUSB1. IMO, I can just remove this parameter completely and update the Roboteq node accordingly.
Launch mammoth_snowplow on the physical Yeti, drive her around the IAVS, and then view the global/local costmaps.
Expected: the global and local costmaps update according to the environment, meaning that if an object moves on the map, the costmap reflects that change.
Actual: the global costmap doesn't update fast enough, so after driving Yeti around for a bit, she views the entire global costmap as an obstacle and gets very, very confused and very, very violent (the poor cone didn't deserve that) when given a point to navigate to.
Raising the local and global costmap servers to a higher update rate seemed to help out a lot.
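Nav2's costmap servers expose these rates as parameters. A sketch of the kind of change described (values here are illustrative, not the ones actually used on Yeti; the global costmap has the same pair of parameters):

```yaml
local_costmap:
  local_costmap:
    ros__parameters:
      update_frequency: 10.0   # Hz; how often the costmap recomputes
      publish_frequency: 5.0   # Hz; how often it is published for planners
```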
Because Mammoth is a large system, things need to be brought up in order for it to work correctly; if they aren't, things break. Thankfully Nav2 has a lifecycle manager that lets us manage the entire system from start to finish.
Right now Nav2 manages itself and the Ouster driver. This causes some instability depending on when SLAM Toolbox or the RealSense (used for odom) is brought up. So after comp, I should plan to add the RealSense driver to the lifecycle manager, as well as SLAM Toolbox.
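The lifecycle manager brings nodes up in the order of its `node_names` list, so the managed set after that change might look like this (node names here are assumptions; check each driver's actual node name):

```yaml
lifecycle_manager:
  ros__parameters:
    autostart: true
    node_names:            # bring-up order matters: sensors first
      - ouster_driver      # lidar (already managed)
      - realsense_driver   # to be added: odom source
      - slam_toolbox       # to be added: mapping
      - controller_server
      - planner_server
      - bt_navigator
```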
The simulation branch is a mess of packages, configuration, and code from three years of just getting Gazebo to work. It should be cleaned up and merged into master as a separate meta-package such as "mammoth_simulation", so that users have both simulation and robot code in one place for development.
This is a bit of a trade-off: it is nice to have separate branches so that the two use cases (simulation and robot) stay separated. Instead of making a package of its own for simulating Mammoth, as other projects may do, I suggest we just have two meta-packages the way the simulation branch does now. What needs to change is refactoring packages and files so that it's clear what each part is doing. I suggest merging the two branches because they no longer effectively separate simulation and robot deployment: the simulation branch can also deploy the robot, and applying fixes to both branches simultaneously whenever bugs are found sounds annoying.
Some things that should be done as part of this cleanup are:
At the moment there is no drive mode control in the ROS 2 code. In the ROS 1 version, we used the robot state controller to mediate whether the controller or the navigation system was driving the robot.
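The core of that mediation is a mux over velocity command sources. A minimal sketch (a hypothetical stand-in for the ROS 1 robot state controller's role, not a port of it; commands are (linear, angular) pairs):

```python
from enum import Enum

class DriveMode(Enum):
    TELEOP = "teleop"          # controller drives the robot
    AUTONOMOUS = "autonomous"  # navigation stack drives the robot
    DISABLED = "disabled"      # nothing drives the robot

def select_command(mode: DriveMode, teleop_cmd, nav_cmd):
    """Pick which velocity command reaches the motors for the current mode."""
    if mode is DriveMode.TELEOP:
        return teleop_cmd
    if mode is DriveMode.AUTONOMOUS:
        return nav_cmd
    return (0.0, 0.0)  # DISABLED: always command a stop
```

In ROS 2 this could live in a small node subscribing to both cmd_vel sources, or be replaced outright by an off-the-shelf twist mux package.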
For some reason, the behavior tree within Nav2 for mammoth_snowplow references a change-goal node when it shouldn't. What makes this error even weirder is that mammoth_gazebo and mammoth_snowplow share the exact same Nav2 configuration file, yet mammoth_gazebo navigates without error.
Launch mammoth_snowplow on the Intel NUC.
Expected: the Nav2 stack launches and the lifecycle manager cycles through each node's states.
Actual: the Nav2 stack launches but fails on bringup, with an error saying it's missing a behavior tree plugin that is only available in ROS 2 Galactic.
I was able to fix this by embedding the ros2_ouster launch/config files in the nav launch/config files and adding the Ouster driver to the lifecycle manager as the first node to bring up. I guess Nav2 didn't like the lidar being brought up unmanaged.
We found at competition that Yeti cannot be started or controlled over SSH. This is a major inconvenience, as the software must then be started using a monitor we literally put on the bot. Fixing this would make using Yeti in competition much easier.
EDIT: to clarify, this is really two issues.
It looks like the ROS 2 GitHub Action is failing because a linter is set up to error when style guidelines aren't followed. We should really make a pipeline that formats our code on the CI run; otherwise the CI will always fail even when the tests pass, which isn't very useful.
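One way to do that is a workflow step that runs the formatter instead of just checking. A sketch, assuming the repo uses the standard ament linters (this step is hypothetical, not an existing workflow in this repo):

```yaml
# Hypothetical CI step: reformat rather than fail on style.
- name: Format code
  run: |
    # ament_uncrustify --reformat rewrites C++ sources in place
    ament_uncrustify --reformat src/
    # surface any changes without failing the run
    git diff --exit-code || echo "::warning::files were reformatted"
```

Alternatively, running the formatter as a pre-commit hook locally would keep CI as a pure check that rarely fires.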
Not sure how to replicate this, but at competition we had to rotate Yeti 30 degrees to the right to get her to plow straight. We thought it was because the costmap is not cleared at the beginning, so Yeti would attempt to plan a path around unknown space. This is likely a gmapping issue.
Write a node which handles stall detection and avoidance.
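The core detection logic could be as simple as timing how long the robot commands motion without the wheels responding. A minimal sketch of that check (hypothetical; thresholds are illustrative, and a real node would feed this from cmd_vel and wheel odometry):

```python
def is_stalled(cmd_speed: float, measured_speed: float,
               stalled_time: float, dt: float,
               speed_eps: float = 0.05, timeout: float = 1.0):
    """Accumulate time where motion is commanded but the wheels don't move.

    Returns (stalled, updated_stalled_time). Call once per control tick of
    length dt; declare a stall once the condition persists for `timeout` s.
    """
    if abs(cmd_speed) > speed_eps and abs(measured_speed) < speed_eps:
        stalled_time += dt  # commanding motion, not moving: count it
    else:
        stalled_time = 0.0  # moving (or not commanding): reset the timer
    return stalled_time >= timeout, stalled_time
```

The avoidance half (back up, replan, or alert the operator) would then trigger off the stalled flag.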
The Raspberry Pi now has an updated image with Melodic installed, so we can begin testing basic driving code on it.