
Comments (12)

apojomovsky commented on August 14, 2024

This sounds expected as long as we rely on pure odometry for RViz, because of wheel slipping.
We should likely publish/consume ground truth information to prevent this from happening.
I'm curious what approach is followed with the real robot for RViz?

from create3_sim.

alsora commented on August 14, 2024

If the robot is pushing against an obstacle (let's treat it as a wall, i.e. it can't be moved), the wheels will indeed be slipping or even stuck.
However, the robot will use the mouse sensor to compute a dead-reckoning estimate.

In this case, the mouse differential motion would be null, so the robot would appear as not moving.
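For intuition, the mouse-based dead reckoning described above can be sketched as accumulating body-frame optical-flow deltas rotated into the odom frame. This is a minimal sketch; the function name is illustrative, and heading is assumed to come from another source (e.g. a gyro):

```python
import math

def integrate_mouse_delta(pose, dx_body, dy_body):
    """Accumulate one mouse (optical-flow) sample, given in the robot's
    body frame, into the odom-frame pose (x, y, theta). Theta is passed
    through unchanged since heading comes from elsewhere (e.g. gyro)."""
    x, y, theta = pose
    x += dx_body * math.cos(theta) - dy_body * math.sin(theta)
    y += dx_body * math.sin(theta) + dy_body * math.cos(theta)
    return (x, y, theta)

# Pushing against a wall: the wheels slip but the mouse sees no motion,
# so the delta is zero and the estimated pose stays put.
pose = (1.0, 2.0, math.pi / 2)
pose = integrate_mouse_delta(pose, 0.0, 0.0)
```

With wheel encoders the pose would keep advancing in this scenario; with the mouse delta at zero it does not, which is exactly the behavior described above.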


alsora commented on August 14, 2024

For the sake of the simulated robot, I would say that dead reckoning pose and ground truth pose should coincide.


eborghi10 commented on August 14, 2024

As far as I remember, the dead reckoning pose is calculated automatically by ros2_control. We could instead publish our own odom TF from ground truth data.
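Publishing the odom TF from ground truth essentially means expressing the current ground-truth pose relative to the ground-truth pose at spawn time (the odom origin). A minimal 2D sketch of that relative-pose math, in plain Python without ROS (function and variable names are illustrative):

```python
import math

def relative_pose(origin, current):
    """odom -> base_link as the current ground-truth pose expressed in
    the frame of the initial ground-truth pose (the odom origin).
    Poses are (x, y, yaw) tuples in the world frame."""
    ox, oy, oyaw = origin
    cx, cy, cyaw = current
    dx, dy = cx - ox, cy - oy
    # Rotate the world-frame offset into the origin's frame.
    c, s = math.cos(-oyaw), math.sin(-oyaw)
    return (c * dx - s * dy, s * dx + c * dy, cyaw - oyaw)

# e.g. robot spawned at world pose (3, 1, yaw 0), now at (4, 1, yaw 0):
# its odom-frame pose is (1, 0, 0), even though the ground-truth
# world pose never starts at zero.
pose_in_odom = relative_pose((3.0, 1.0, 0.0), (4.0, 1.0, 0.0))
```

A TF-publishing node would then broadcast this result as the odom -> base_link transform instead of the ros2_control estimate.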


apojomovsky commented on August 14, 2024

That's certainly interesting, though I kind of agree with @alsora that we could simplify things for the simulator and rely on simulated ground truth.


alsora commented on August 14, 2024

Besides this particular problem, do you see any issue for users who want to test a SLAM system in the simulator?

By using the ground truth they would get a perfect odometry that also never drifts. This would be different from the odometry obtained from perfect sensors (i.e. no noise), which would still drift due to sampling and approximations in the integration procedure.

Maybe the best long-term solution would be to have the odom TF computed from mouse data rather than the wheels, under the assumption that the mouse always tracks correctly (i.e. I assume the mouse delta is obtained from ground truth)
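The drift-from-integration point can be made concrete: even with perfectly clean velocities, Euler-integrating a unicycle model at a finite sample rate diverges from the exact circular arc. A small sketch (the velocities and time step are illustrative, not the robot's):

```python
import math

v, w, dt = 0.3, 1.0, 0.05   # clean, noise-free linear/angular velocities
steps = 40                   # integrate for t = 2 s

# Euler-integrated odometry (what a simple diff-drive integrator does)
x = y = th = 0.0
for _ in range(steps):
    x += v * math.cos(th) * dt
    y += v * math.sin(th) * dt
    th += w * dt

# Exact unicycle pose at t = 2 s (constant v, w trace a circular arc)
t = steps * dt
ex = (v / w) * math.sin(w * t)
ey = (v / w) * (1.0 - math.cos(w * t))

# Nonzero despite perfect inputs: pure discretization error.
drift = math.hypot(x - ex, y - ey)
```

Shrinking dt reduces the drift but never eliminates it, whereas a TF taken straight from ground truth would have exactly zero drift, which is the difference being discussed.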


eborghi10 commented on August 14, 2024

Wouldn't it be better to use robot_localization or another Kalman filter to fuse dead reckoning, IMU, and mouse data? I don't know exactly how the real robot performs this, but it would solve the issue without causing any problems for SLAM testing.
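For reference, a robot_localization EKF fusing wheel odometry, a mouse-based odometry source, and an IMU might be configured roughly like this. This is a sketch only; the topic names, in particular /mouse_odom, are hypothetical:

```yaml
ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    two_d_mode: true
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom          # filter publishes odom -> base_link

    odom0: /odom               # wheel dead reckoning: trust velocities only
    odom0_config: [false, false, false,   # x, y, z
                   false, false, false,   # roll, pitch, yaw
                   true,  true,  false,   # vx, vy, vz
                   false, false, true,    # vroll, vpitch, vyaw
                   false, false, false]   # ax, ay, az

    odom1: /mouse_odom         # hypothetical mouse-based odometry topic
    odom1_config: [false, false, false,
                   false, false, false,
                   true,  true,  false,
                   false, false, false,
                   false, false, false]

    imu0: /imu
    imu0_config: [false, false, false,
                  false, false, true,
                  false, false, false,
                  false, false, true,
                  false, false, false]
```

Fusing only velocities from the wheel source means a slipping-wheel episode is at least down-weighted by the mouse and IMU evidence rather than integrated blindly.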


alsora commented on August 14, 2024

At the moment the robot is not using a Kalman filter.


apojomovsky commented on August 14, 2024

Returning to the simulated robot discussion, my understanding is that we would like to have a frame of reference in RViz that behaves better than pure odometry. I don't think there's a better option than the absolute Gazebo pose published as ground truth data for this particular use case (the RViz reference frame).
Estimators are great, but I suspect it should be up to the robot users (developers, researchers, robot tinkerers) to implement such algorithms. Even for those applications, having ground truth data would come in handy for benchmarking, etc.


eborghi10 commented on August 14, 2024

Sounds good to me if we modify the ground truth plugin to publish the odom <-> base_link TF. But it should be possible to revert to the default behavior with a flag.
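Such a flag could be exposed as a launch argument gating the ground-truth TF publisher. A hypothetical ROS 2 launch sketch (the argument, package, and executable names are illustrative, not the repo's actual ones):

```python
from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument
from launch.conditions import IfCondition
from launch.substitutions import LaunchConfiguration
from launch_ros.actions import Node


def generate_launch_description():
    use_gt_tf = DeclareLaunchArgument(
        'ground_truth_odom_tf', default_value='true',
        description='Publish odom -> base_link from simulated ground truth')

    # Only spawned when the flag is true; otherwise the default
    # ros2_control odometry TF is left in place.
    gt_tf_node = Node(
        package='irobot_create_gazebo',          # illustrative package name
        executable='ground_truth_tf_publisher',  # illustrative executable
        condition=IfCondition(LaunchConfiguration('ground_truth_odom_tf')))

    return LaunchDescription([use_gt_tf, gt_tf_node])
```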


shuhaowu commented on August 14, 2024

Should RViz be showing the "perfect" position of the robot? For a real robot it definitely doesn't, as some sort of estimation is always present (SLAM, odometry, whatever).

It seems to me there are a few things at play here; let me know if my assessment is accurate, as I'm just getting started working on this. Sorry if I get anything wrong:

  1. The Gazebo sim_ground_truth_pose emitted by p3d, which should coincide perfectly with the pose of the robot in Gazebo. This represents the "actual" pose of the robot as if it were in the physical world. On a real robot, this would be provided by a system like OptiTrack, so what's emitted seems correct to me.
    • This topic is useful if the robot user wants to evaluate the accuracy of SLAM/localization algorithms in a downstream project. I don't think it should be eliminated.
    • A minor note: the current sim_ground_truth_pose message appears to be emitted for the transform world -> base_link, but nothing is broadcasting this, as p3d can't broadcast TFs in ROS 2. This is probably a good thing, to avoid conflicting with the below...
  2. Right now it seems like /odom and the odom -> base_link transform are calculated by ros2_control, via its Gazebo plugin. For a real robot, this would be computed either from the wheel encoders, or perhaps from the mouse (or a fusion of both). While I'm not quite sure how ros2_control and Gazebo calculate it, I assume it's based on the simulated wheel turns, or some other simulation mechanism. This feels like it correctly "simulates" the real world.
    • Downstream projects might want to make this odometry value noisier, to evaluate the performance of various algorithms in the presence of noisy odometry. So it would be nice to be able to rename this topic when launching create3_nodes.launch, so downstream simulations can implement their own noise filter on the odometry information.
    • The above comments mentioned that maybe we should modify p3d to publish odom -> base_link. This would imply that the robot's odometry is perfect and never drifts.
  3. Right now, there's no world/map -> odom transform published by this repo. This is probably the correct thing to do, as usually this transform is published by the localization/SLAM algorithm, which probably shouldn't be a part of this repo.

Most of the behaviour here actually seems OK to me, so is there a problem here? I might be misunderstanding something, though. One minor problem is that the world frame is technically undefined due to the lack of a TF broadcast. This makes it difficult to plot the sim_ground_truth alongside the actual odometry value to compare them in RViz, since there's no world frame (or maybe I'm doing it wrong somehow, which is always possible). EDIT: my mistake. The odometry coming from /sim_ground_truth_pose with the AWS small house (which I'm using) starts at a non-zero X/Y/Z position, while the position for odom obviously starts at (0, 0, 0). I just didn't zoom out far enough to see the arrow for the ground truth... 😅
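The renaming idea in point 2 could be done with ROS 2 launch remappings, so a downstream project can inject noise before republishing. A hypothetical sketch (package, executable, and topic names are illustrative):

```python
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        # The simulated odometry goes out as /odom_raw instead of /odom...
        Node(
            package='create3_control',           # illustrative package name
            executable='diff_drive_controller',  # illustrative executable
            remappings=[('/odom', '/odom_raw')]),
        # ...and a downstream node adds noise and republishes /odom.
        Node(
            package='my_sim_experiments',        # downstream project's node
            executable='odom_noise_injector',
            remappings=[('odom_in', '/odom_raw'),
                        ('odom_out', '/odom')]),
    ])
```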


shuhaowu commented on August 14, 2024

After playing around with it a bit, I do see that if I hit an object, the RViz position continues to increase while the ground truth position doesn't. This also kind of makes sense: the wheels would be slipping in this case, so the encoders would still register ticks. Solving this seems like a downstream project's problem? Since it's expected that /odom behaves like that when the wheels slip, a downstream project should "fix" it by fusing/using alternative odometry sources (like the mouse sensor).

