
Comments (11)

youliangtan commented on August 26, 2024

It seems that only one marker is detected when the robot tries to enter the "steer_dock" state; that may be why the action failed.

Can you check with the /fiducial_images image topic viz whether the markers are still observable by aruco_detect? We have found that tag detection is sensitive to the lighting conditions of the local environment. Also, it seems the robot isn't going into the "parallel correction" state; you can also try reducing the threshold of the max_parallel_offset param.
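For reference, the /fiducial_images topic can be viewed with rqt_image_view or with rosrun image_view image_view image:=/fiducial_images. Below is a minimal sketch that logs which fiducial IDs are currently detected, assuming a ROS 1 setup where aruco_detect publishes fiducial_msgs on /fiducial_transforms (topic name may differ on your setup):

#!/usr/bin/env python3
# Sketch only: print the fiducial IDs that aruco_detect currently sees.
import rospy
from fiducial_msgs.msg import FiducialTransformArray

def on_fiducials(msg):
    ids = [t.fiducial_id for t in msg.transforms]
    rospy.loginfo("detected fiducials: %s", ids)

rospy.init_node("fiducial_check")
rospy.Subscriber("/fiducial_transforms", FiducialTransformArray, on_fiducials)
rospy.spin()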


AmelFaidi commented on August 26, 2024

Hi youliangtan,
Thanks for your response.
At first the robot is parallel to the 3 tags and can detect all of them. I don't understand why it tries to turn left instead of going straight to the charging station. When it turns left, it loses detection of the tags, so it can't reach the station.
Currently max_parallel_offset: 0.16.


youliangtan commented on August 26, 2024

In the predock state, the current logic is that the robot will align itself vertically to the detected side markers. This realignment action will happen even if only 1 marker is detected.

I think there's some misinformation in the printout of this. This line gets printed even if the robot is only detecting 1 marker (do_single_side_marker_rotate()). So from the log you have provided here, I believe the robot is not able to detect fiducial_10. Do check the image view on the robot. All in all, the robot needs to detect both side markers in order to transition successfully to the next "steer_dock" state.
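For clarity, here is a rough pseudocode reading of the predock behaviour described above. This is illustrative only, not the package's actual implementation; the action names are made up for the sketch:

# Illustrative pseudocode only, not the actual autodock source.
def predock_decision(left_visible, right_visible, parallel_offset, max_parallel_offset):
    """Return the action the robot takes in the 'predock' state."""
    if left_visible and right_visible:
        if abs(parallel_offset) > max_parallel_offset:
            return "parallel_correction"       # realign until parallel to the dock
        return "transition_to_steer_dock"      # both side markers seen and aligned
    if left_visible or right_visible:
        return "single_side_marker_rotate"     # rotate in place towards the missing marker
    return "retry_detection"                   # no markers detected at all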


AmelFaidi commented on August 26, 2024

At first the robot detected the 2 markers (I verified this with image view), so it started to rotate in place. It then detected only the marker fiducial_10 and kept rotating until no marker was detected.
Which parameters should I check to make sure the robot transitions to the steer_dock state?
I reduced max_parallel_offset as you suggested, but there was no change.


AmelFaidi commented on August 26, 2024

Hi youliangtan
I resolved the problem with the tags, but the robot stops at 1 m from the fiducial tags and publishes that it has arrived at the docking station.


youliangtan commented on August 26, 2024

Nice that you resolved the tags problem. It would be great if you could share what changes you made to solve it.

Related to the robot stopping 1 m from the tags: this happens because the robot lost detection of the tags and transitioned to the "last_mile" state. Try adjusting this value, which lets the robot rely on pure odometry to move towards the charger.


AmelFaidi commented on August 26, 2024

I changed these values in turtlebot3.yaml:
stop_yaw_diff: 3 # radian
max_parallel_offset: 0.05


AmelFaidi commented on August 26, 2024

to_last_mile_dis: 0.3 # edge2edge distance where transition to LM
to_last_mile_tol: 0.1 # transition tolerance from SD to LM
Don't these parameters affect the stopping distance from the tags?


youliangtan commented on August 26, 2024

The to_last_mile_dis and to_last_mile_tol params that you mentioned are the condition for the state transition from "steer_dock" to "last_mile". Changing these 2 params helps ensure that the robot will move on odometry during the very last mile of docking (without relying on visual markers).
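As a sketch of that condition, here is one plausible reading of how the edge-to-edge distance is compared against the two params. This is an assumption for intuition, not the package's exact check:

# Illustrative only: one plausible reading of the steer_dock -> last_mile transition.
def should_enter_last_mile(edge_to_edge_dist, to_last_mile_dis, to_last_mile_tol):
    """True when the remaining distance to the charger is close enough to
    hand over from marker-based steering to odometry-only motion."""
    return edge_to_edge_dist <= to_last_mile_dis + to_last_mile_tol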


AmelFaidi commented on August 26, 2024

@youliangtan Same thing: the robot stays 0.6 m from the tags when I set these values:
max_last_mile_odom: 0.05 #0.2
stop_distance: 0.08 #0.08 # edge2edge distance to stop from charger
When I increase max_last_mile_odom to 0.25, the robot keeps going forward until it loses detection of the tags and prints IDLE.


youliangtan commented on August 26, 2024

I think you might need to increase the value of max_last_mile_odom. This is the distance the robot will move based on odometry after losing the visual marker detection. Unfortunately, this pkg requires some tuning to make it work on different robots. Hopefully you can provide a simple example for me to replicate this issue. If this is indeed an issue, a fix might be needed.
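For illustration, a sketch of how max_last_mile_odom might cap the odometry-only travel. This is an assumption to build tuning intuition, not the package's exact logic:

# Illustrative only: max_last_mile_odom caps how far the robot drives blind
# on odometry once the markers are no longer visible.
def last_mile_travel(last_known_edge_dist, stop_distance, max_last_mile_odom):
    remaining = max(last_known_edge_dist - stop_distance, 0.0)
    return min(remaining, max_last_mile_odom)   # a small cap stops the robot early

Under this reading, a cap of 0.05 m would explain the robot stopping well short of the charger, which is consistent with the suggestion to increase it.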

