kitti_odometry_evaluation_tool's Issues

ZeroDivisionError: division by zero

line 194, in computeOverallErr
ave_t_err = t_err / seq_len
ZeroDivisionError: division by zero

I debugged a little and found that the seq_err computed below is wrong (it comes back empty, which makes seq_len zero). I used my own data, and both poses_gt and poses_result could be read successfully.

compute sequence errors

        seq_err = self.calcSequenceErrors(poses_gt, poses_result)
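A guard for the empty case may help here; this is a minimal sketch assuming the tuple layout that calcSequenceErrors appears to produce (first_frame, r_err, t_err, length, speed), and the early return is a choice for illustration, not the tool's documented behavior. seq_err is empty when the trajectory is too short for any 100-800 m segment, which is the usual cause of this error:

def computeOverallErr(seq_err):
    # seq_err: list of (first_frame, r_err, t_err, length, speed)
    # entries; empty when no segment of the required length
    # (100-800 m) fits inside the trajectory.
    if len(seq_err) == 0:
        # Avoid the ZeroDivisionError and signal "no valid segments".
        return 0.0, 0.0
    t_err = sum(err[2] for err in seq_err)
    r_err = sum(err[1] for err in seq_err)
    seq_len = len(seq_err)
    return t_err / seq_len, r_err / seq_len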

Pose transformation

Thank you for your outstanding work. I have a question: KITTI odometry's ground-truth pose is relative to cam0 at t0. My data is as follows: at each time I am given the position (x, y, z) of the camera on the vehicle in the world coordinate system, together with the transformation matrix from the camera to the world coordinate system. How can I compute the pose of the camera at each time relative to the camera at t0? Looking forward to your reply.
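For reference, a minimal sketch: if T_w_ci is the 4x4 camera-to-world transform at time i (built from your rotation and (x, y, z)), then the pose relative to the camera at t0 is inv(T_w_c0) @ T_w_ci, which is the KITTI ground-truth convention:

import numpy as np

def relative_to_first(T_w_c):
    # T_w_c: list of 4x4 camera-to-world transforms, one per timestamp.
    # Returns each pose expressed relative to the camera frame at t0.
    T_c0_w = np.linalg.inv(T_w_c[0])
    return [T_c0_w @ T for T in T_w_c]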

Average sequence rotation error

Thanks for sharing; the evaluation results are really comprehensive! I have a question. I have read some papers that use KITTI odometry, and they report results in the following form:
• t: average translational RMSE drift (%) on a length of 100–800 m.
• r: average rotational RMSE drift (°/100 m) on a length of 100–800 m.

The results this tool gives are:
Average sequence translation RMSE (%): 6.2131
Average sequence rotation error (deg/m): 0.0170
I think the translation RMSE is computed under the same convention, but the rotation error may not be. Is it correct to simply multiply 0.0170 by 100?
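For what it's worth, the conversion is pure unit arithmetic: 0.0170 deg per metre is 0.0170 x 100 = 1.70 deg per 100 m, so multiplying by 100 only changes the unit, not the metric:

r_err_deg_per_m = 0.0170                        # value reported by the tool
r_err_deg_per_100m = r_err_deg_per_m * 100.0    # = 1.70 deg / 100 m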

The principle of calculation

Hello!
I ran your code, but I cannot understand the principle by which it evaluates an algorithm, and I could not find it in the KITTI dataset paper either. Could you give a rough explanation? Many thanks!
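For context, the KITTI metric averages relative-pose errors over fixed segment lengths (100, 200, ..., 800 m): for each start frame and each length it compares the ground-truth and estimated motion across that segment and normalizes the residual by the segment length. A minimal sketch of the per-segment error, following the KITTI devkit formulas rather than this tool's exact code:

import numpy as np

def segment_error(T_gt_a, T_gt_b, T_est_a, T_est_b, length):
    # Relative motion over the segment [a, b] in each trajectory.
    delta_gt = np.linalg.inv(T_gt_a) @ T_gt_b
    delta_est = np.linalg.inv(T_est_a) @ T_est_b
    # Residual transform between estimate and ground truth.
    err = np.linalg.inv(delta_est) @ delta_gt
    # Rotation angle from the trace of the rotation block; translation
    # from the norm of the translation block; both per metre.
    cos_angle = np.clip((np.trace(err[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    r_err = np.arccos(cos_angle)
    t_err = np.linalg.norm(err[:3, 3])
    return r_err / length, t_err / length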

It doesn't work when I try the method you posted

I tried the method in the README and it didn't work. Could you tell me what needs to change? Thanks.
python evaluation.py --result_dir=./data/ --eva_seqs=09_pred,10_pred
Traceback (most recent call last):
File "evaluation.py", line 17, in
import tools.transformations as tr
ImportError: No module named tools.transformations
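One common cause is that the tools package is not visible from the current working directory, or the checkout is incomplete. A workaround sketch (an assumption, not the tool's documented fix) is to pin the repository root onto sys.path before the failing import:

# Near the top of evaluation.py, before "import tools.transformations":
import os
import sys
# Make the directory containing the 'tools' package importable,
# regardless of where the script is launched from.
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))
import tools.transformations as tr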

numpy.linalg.LinAlgError: Singular matrix

I generated a pose.txt from my SLAM algorithm.

This tool throws an error on it, while another evaluation tool handles the same file fine.

Could you kindly tell me the reason?

ZZZZ@DeskPC:~/Desktop/visual_odometry/KITTI_odometry_evaluation_tool$ python3 evaluation.py --result_dir=./data/ --eva_seqs=09_pred
Traceback (most recent call last):
  File "evaluation.py", line 663, in <module>
    pose_eval.eval(toCameraCoord=args.toCameraCoord)   # set the value according to the predicted results
  File "evaluation.py", line 592, in eval
    seq_err = self.calcSequenceErrors(poses_gt, poses_result)
  File "evaluation.py", line 165, in calcSequenceErrors
    pose_error = np.dot(np.linalg.inv(pose_delta_result), pose_delta_gt)
  File "/usr/local/lib/python3.5/dist-packages/numpy/linalg/linalg.py", line 551, in inv
    ainv = _umath_linalg.inv(a, signature=signature, extobj=extobj)
  File "/usr/local/lib/python3.5/dist-packages/numpy/linalg/linalg.py", line 97, in _raise_linalgerror_singular
    raise LinAlgError("Singular matrix")
numpy.linalg.LinAlgError: Singular matrix
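A hedged sanity check you can run on the pose file before evaluating; the singular-matrix error usually means one of the poses is not a valid rigid transform (for example a line that did not parse into 12 numbers, or a rotation block with zero determinant). The layout assumed below is the standard KITTI 12-values-per-line format:

import numpy as np

def check_poses(path):
    # Each line should hold the top 3 rows of a 4x4 homogeneous
    # transform as 12 numbers; flag lines whose rotation block is
    # (near-)singular, which is what makes np.linalg.inv blow up.
    with open(path) as f:
        for i, line in enumerate(f):
            vals = np.array(line.split(), dtype=np.float64)
            if vals.size != 12:
                print("line %d: expected 12 values, got %d" % (i, vals.size))
                continue
            R = vals.reshape(3, 4)[:3, :3]
            if abs(np.linalg.det(R)) < 1e-6:
                print("line %d: singular rotation block" % i)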

Wrong direction

Thank you very much for your work, but when I use this tool to draw the trajectory, the directions of (x, y, z) do not look correct.
[plots: 07_path, 07_path_3D]
How can I solve this problem? Am I using the tool in the wrong way?
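If the axes look mirrored or swapped, a common cause is that the result poses are not in the KITTI camera convention (x right, y down, z forward). A minimal sketch of re-expressing every pose with a fixed axis change, where R_fix is a hypothetical 4x4 you pick for your sensor setup (the tool's --toCameraCoord option applies a similar fixed transform):

import numpy as np

# Hypothetical axis fix mapping your frame to the KITTI camera frame;
# identity shown only as a stub, substitute your own permutation.
R_fix = np.eye(4)

def to_camera_coord(poses):
    # Conjugation re-expresses each pose in the new coordinate frame.
    R_fix_inv = np.linalg.inv(R_fix)
    return [R_fix @ T @ R_fix_inv for T in poses]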

Extrinsic transformation from camera to LiDAR

When running a laser SLAM algorithm, the resulting odometry trajectory is in the LiDAR coordinate system, while the ground truth is in the camera coordinate system. During trajectory evaluation we can pass --toCameraCoord True to transform the trajectory into the camera coordinate system.
However, the extrinsic matrix in the program only permutes the axes:
R_C2L = np.array([[ 0,  0,  1,  0],
                  [-1,  0,  0,  0],
                  [ 0, -1,  0,  0],
                  [ 0,  0,  0,  1]])
But KITTI's calibration file gives the actual LiDAR-to-camera transformation, Tr in calib.txt:
Tr: 4.276802385584e-04 -9.999672484946e-01 -8.084491683471e-03 -1.198459927713e-02 -7.210626507497e-03 8.081198471645e-03 -9.999413164504e-01 -5.403984729748e-02 9.999738645903e-01 4.859485810390e-04 -7.206933692422e-03 -2.921968648686e-01

My question is: when I use this real extrinsic I get a larger error, but with the program's default matrix I get a smaller error. How can I evaluate the laser SLAM algorithm correctly?
(The first evaluation used the extrinsics from the calibration file; the second used the program's default.)
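For what it's worth, applying an extrinsic to a whole pose is a conjugation, not a plain left-multiplication; a minimal sketch assuming Tr is the 3x4 LiDAR-to-camera extrinsic from calib.txt and tr_values is a hypothetical list of its 12 numbers:

import numpy as np

# Extend the 3x4 calib.txt row to a full 4x4 homogeneous transform.
Tr = np.vstack([np.array(tr_values).reshape(3, 4), [0., 0., 0., 1.]])
Tr_inv = np.linalg.inv(Tr)

def lidar_pose_to_camera(T_lidar):
    # A pose expressed in the LiDAR frame becomes a camera-frame pose
    # by conjugation: T_cam = Tr @ T_lidar @ inv(Tr).
    return Tr @ T_lidar @ Tr_inv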

Monocular Visual Odometry 6 DOF

I'm trying to evaluate on KITTI odometry.
First I convert each pose to 6DoF using this code:

import math
import numpy as np

def isRotationMatrix(R):
    # R is a valid rotation matrix iff R^T R = I (det(R) = 1 follows
    # for proper rotations).
    return np.linalg.norm(np.eye(3) - np.dot(R.T, R)) < 1e-6

def rotationMatrixToEulerAngles(R):
    # Decompose R = Rz @ Ry @ Rx into x-y-z Euler angles.
    assert isRotationMatrix(R)
    sy = math.sqrt(R[0, 0] * R[0, 0] + R[1, 0] * R[1, 0])
    singular = sy < 1e-6

    if not singular:
        x = math.atan2(R[2, 1], R[2, 2])
        y = math.atan2(-R[2, 0], sy)
        z = math.atan2(R[1, 0], R[0, 0])
    else:
        # Gimbal lock: pitch is +/-90 deg, roll and yaw are coupled.
        x = math.atan2(-R[1, 2], R[1, 1])
        y = math.atan2(-R[2, 0], sy)
        z = 0
    return np.array([x, y, z], dtype=np.float32)

def matrix_rt(p):
    # Lift a 12-value KITTI pose row into a 4x4 homogeneous matrix.
    return np.vstack([np.reshape(p.astype(np.float32), (3, 4)),
                      [0., 0., 0., 1.]])

# poses[index]: the 12-value pose rows of one sequence.
odometries = []
for i in range(len(poses[index]) - 1):
    pose1 = matrix_rt(poses[index][i])
    pose2 = matrix_rt(poses[index][i + 1])
    # Relative motion of frame i+1 with respect to frame i.
    pose2wrt1 = np.dot(np.linalg.inv(pose1), pose2)
    R = pose2wrt1[0:3, 0:3]
    t = pose2wrt1[0:3, 3]
    angles = rotationMatrixToEulerAngles(R)
    odometries.append(np.concatenate((t, angles)))

Also, the model output is in the same 6DoF format.

The question is how to evaluate these 6DoF results.
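One sketched approach: convert each relative 6DoF prediction back to a 4x4 transform, chain them into absolute poses, and write those out in KITTI's 12-numbers-per-line format so this tool can evaluate them. The Euler convention below (R = Rz @ Ry @ Rx) is assumed to match rotationMatrixToEulerAngles above:

import numpy as np

def six_dof_to_matrix(p):
    # p = (tx, ty, tz, rx, ry, rz); inverse of the decomposition above.
    tx, ty, tz, rx, ry, rz = p
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = [tx, ty, tz]
    return T

def chain_to_absolute(relative_6dof):
    # Accumulate relative motions into absolute poses for evaluation.
    poses = [np.eye(4)]
    for p in relative_6dof:
        poses.append(poses[-1] @ six_dof_to_matrix(p))
    return poses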
