Comments (12)
This sounds expected as long as we rely on pure odometry for RViz, because of slipping.
We should probably publish/consume ground truth information to prevent this from happening.
I'm curious: what approach is used with the real robot for RViz?
from create3_sim.
If the robot is pushing against an obstacle (let's treat it as a wall, i.e. it can't be moved), the wheels will indeed be slipping or even stuck.
However, the robot will use the optical mouse sensor to compute a dead-reckoning estimate.
In this case, the mouse differential motion would be null, so the robot would appear to be not moving.
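To illustrate the idea, here is a minimal sketch of mouse-based dead reckoning (the function name and frame conventions are my own assumptions for illustration, not the robot's actual API):

```python
import math

def integrate_mouse_delta(pose, dx_body, dy_body, dtheta):
    """Accumulate one mouse-sensor displacement (body frame) into a
    world-frame (x, y, theta) dead-reckoning pose estimate.

    If the wheels slip against a wall, the optical mouse reports a
    near-zero delta, so the estimate correctly stays put."""
    x, y, theta = pose
    # Rotate the body-frame displacement into the world frame.
    x += dx_body * math.cos(theta) - dy_body * math.sin(theta)
    y += dx_body * math.sin(theta) + dy_body * math.cos(theta)
    theta += dtheta
    return (x, y, theta)

# Wheels spinning against a wall: mouse delta is null, pose is unchanged.
pose = (1.0, 2.0, 0.5)
assert integrate_mouse_delta(pose, 0.0, 0.0, 0.0) == pose
```

With the wheel encoders instead, the same stuck-against-a-wall scenario would keep accumulating displacement, which is exactly the RViz drift described above.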
from create3_sim.
For the sake of the simulated robot, I would say that dead reckoning pose and ground truth pose should coincide.
from create3_sim.
As far as I remember, the dead-reckoning pose is calculated automatically by ros2_control. We could instead publish our own odom TF from ground truth data.
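As a sketch of that idea (plain Python, 2D only; in the simulator this would live in a ROS 2 node broadcasting the TF via tf2_ros, and the function below and its frame handling are my assumptions):

```python
import math

def ground_truth_to_odom_tf(initial_pose, current_pose):
    """Express the current ground-truth pose (x, y, theta) relative to
    the initial ground-truth pose, yielding an odom -> base_link
    transform that starts at the origin like a normal odometry frame."""
    x0, y0, th0 = initial_pose
    x, y, th = current_pose
    # Translate to the initial position, then rotate into the initial frame.
    dx, dy = x - x0, y - y0
    rel_x = dx * math.cos(-th0) - dy * math.sin(-th0)
    rel_y = dx * math.sin(-th0) + dy * math.cos(-th0)
    return (rel_x, rel_y, th - th0)

# Robot spawned at (3, 1) facing +x; ground truth now reports (4, 1, 0),
# so the odom-frame pose is 1 m forward from the start.
t = ground_truth_to_odom_tf((3.0, 1.0, 0.0), (4.0, 1.0, 0.0))
assert abs(t[0] - 1.0) < 1e-9 and abs(t[1]) < 1e-9
```

Subtracting the initial pose matters because world-spawned robots generally don't start at the origin of the Gazebo world.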
from create3_sim.
That's certainly interesting, though I kinda agree with @alsora: we could simplify things for the simulator and rely on simulated ground truth.
from create3_sim.
Besides this particular problem, do you see any issue for users who want to test a SLAM system in the simulator?
By using the ground truth they would get perfect odometry that also never drifts. This would be different from the odometry obtained from perfect sensors (i.e. no noise), which would still drift due to sampling and approximations in the integration procedure.
Maybe the best long-term solution would be to have the odom TF computed from mouse data rather than wheels, under the assumption that the mouse always tracks correctly (i.e. I assume the mouse delta is obtained from ground truth).
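To illustrate why even noise-free odometry drifts: a toy sketch (my own example, not simulator code) where exact unicycle velocities are Euler-integrated at a finite sample rate, so the estimate misses the true endpoint from discretization alone.

```python
import math

def euler_odometry(v, omega, dt, steps):
    """Integrate exact (noise-free) body velocities with a first-order
    Euler step, roughly what a discrete odometry pipeline does."""
    x = y = theta = 0.0
    for _ in range(steps):
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += omega * dt
    return x, y

# Quarter circle of radius 1 m (v = omega = 1 for pi/2 seconds).
# The exact endpoint is (1, 1); sampled integration lands slightly off.
steps = 25
dt = (math.pi / 2) / steps
x, y = euler_odometry(1.0, 1.0, dt, steps)
drift = math.hypot(x - 1.0, y - 1.0)
assert drift > 0.01  # error despite zero sensor noise
```

A ground-truth-based odom TF has none of this error, which is what would make it "too perfect" for SLAM testing.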
from create3_sim.
Wouldn't it be better to use robot_localization or another Kalman filter to fuse dead reckoning, IMU, and mouse data? I don't know exactly how the real robot performs this, but it would solve the issue without adding any problems for SLAM testing.
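For reference, a minimal robot_localization EKF configuration along those lines might look like the sketch below. The topic names, frame ids, and which fields are fused are assumptions for this simulator, not tested values (the `*_config` matrix order is x, y, z, roll, pitch, yaw, then their velocities, then linear accelerations):

```yaml
ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    two_d_mode: true
    publish_tf: true            # broadcast odom -> base_link
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom           # fuse continuous sources only

    # Wheel odometry: fuse x/y velocity and yaw rate, not absolute pose.
    odom0: /odom
    odom0_config: [false, false, false,
                   false, false, false,
                   true,  true,  false,
                   false, false, true,
                   false, false, false]

    # IMU: fuse yaw rate.
    imu0: /imu
    imu0_config: [false, false, false,
                  false, false, false,
                  false, false, false,
                  false, false, true,
                  false, false, false]
```

A mouse-odometry source could be added the same way as a second `odomN` input once it is published as a `nav_msgs/Odometry` topic.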
from create3_sim.
At the moment the robot is not using a Kalman filter.
from create3_sim.
Returning to the simulated robot discussion, my understanding is that we would like a frame of reference in RViz that behaves better than pure odometry. I don't think there's a better option than the absolute Gazebo pose published as ground truth data for this particular use case (the RViz reference frame).
Estimators are great, but I suspect it should be up to the robot user (developers, researchers, robot tinkerers) to implement such algorithms. Even for those applications, having ground truth data would come in handy for benchmarking, etc.
from create3_sim.
Sounds good to me if we modify the ground truth plugin to publish the odom <-> base_link TF. But it should be possible to revert this behavior with a flag.
from create3_sim.
Should RViz be showing the "perfect" position of the robot? For a real robot it definitely doesn't, as some sort of estimation is always present (SLAM, odometry, whatever).
It seems to me there are a few things at play here; let me know if my assessment is accurate, as I'm just getting started working on this. Sorry if I get anything wrong:
- The Gazebo sim_ground_truth_pose emitted by p3d should coincide perfectly with the pose of the robot in Gazebo. This represents the "actual" pose the robot would have in the physical world. On a real robot, this would be provided by a system like OptiTrack, so what's emitted seems correct to me.
  - This topic is useful if the robot user wants to evaluate the accuracy of SLAM/localization algorithms in a downstream project. I don't think it should be eliminated.
  - A minor note: the current sim_ground_truth_pose message appears to be emitted for the transform world -> base_link, but nothing is broadcasting that transform, as p3d can't broadcast TFs in ROS 2. This is probably a good thing, to avoid conflicting with the below...
- Right now it seems like /odom and the odom -> base_link transform are calculated by ros2_control via its Gazebo plugin. On a real robot, this would come either from the wheel encoders, or perhaps from the mouse (or a fusion of the two). While I'm not quite sure how ros2_control and Gazebo calculate this, I assume it's based on the simulated wheel turns, or some other simulation mechanism. This feels like it correctly "simulates" the real world.
  - Downstream projects might want to make this odometry value noisier, to evaluate the performance of various algorithms in the presence of noisy odometry. So it would be nice to be able to rename this topic when launching create3_nodes.launch, so downstream simulations can implement their own noise filter on the odometry information.
- The above comments mentioned that maybe we should modify p3d to publish odom -> base_link. This would imply that the odometry of the robot is perfect and drift-free, which (per the discussion above) wouldn't match how odometry behaves on a real robot.
- Right now, there's no world/map -> odom transform published by this repo. This is probably the correct thing to do, as usually this transform is published by the localization/SLAM algorithm, which probably shouldn't be part of this repo.
Most of the behaviour here actually seems OK to me, so is there a problem here? I might be misunderstanding something, though. One minor problem is that the world frame is technically undefined due to the lack of a TF broadcast. This makes it difficult to plot the sim_ground_truth alongside the actual odometry value to compare them in RViz, as there's no world frame (or maybe I'm doing it wrong somehow, which is always possible). EDIT: my mistake. The odometry coming from /sim_ground_truth_pose with the AWS small house world (which I'm using) starts at a non-zero X/Y/Z position, while the position for odom obviously starts at 0, 0, 0. I just didn't zoom out far enough to see the arrow for the ground truth... 😅
from create3_sim.
After playing around with it a bit, I do see that if I hit an object, the RViz position will continue to increase while the ground truth position doesn't. This also kind of makes sense? The wheels would be slipping in this case, so the encoders would still register ticks. Solving this seems like a downstream project's problem? Since it's expected that /odom behaves like that when wheels slip, a downstream project should "fix" it by fusing/using alternative odometry sources (like the mouse sensor).
from create3_sim.