Comments (4)
Hi @7DoF,
Did you verify the transformation between your LiDAR and IMU? Both start at zero since that is the initialization value, but this may be a reason for the divergence.
from kr_autonomous_flight.
Hi @fcladera
About the extrinsic between the LiDAR and IMU:
The odometry fed into the UKF is in the IMU frame, and the output of the UKF is in the same frame.
I have tried changing this extrinsic in the LIO code to see whether the offset increases or decreases, but it makes no difference to the offset.
(no offset in y, a tiny offset in x, and about 15 cm of offset in z)
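One quick sanity check for the step above: apply a candidate LiDAR-to-IMU extrinsic by hand and confirm it actually moves the reported position. If perturbing the extrinsic never changes the UKF/LIO offset, the extrinsic is probably not being applied where you think it is. This is only a sketch; the rotation and translation values below are placeholders, not a real calibration.

```python
# Sketch: express a LiDAR-frame point in the IMU frame using a
# candidate extrinsic (R_imu_lidar, t_imu_lidar). All values below
# are illustrative placeholders, not the actual calibration.

def apply_extrinsic(p_lidar, R_imu_lidar, t_imu_lidar):
    """p_imu = R_imu_lidar * p_lidar + t_imu_lidar (row-major 3x3 R)."""
    return [
        sum(R_imu_lidar[i][j] * p_lidar[j] for j in range(3)) + t_imu_lidar[i]
        for i in range(3)
    ]

# Identity rotation with a hypothetical 15 cm z lever arm.
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
t = [0.0, 0.0, 0.15]
p_imu = apply_extrinsic([1.0, 2.0, 0.5], R, t)
```

Varying `t` here and re-running the comparison against the UKF output should shift the offset by exactly the perturbation; if it does not, the transform is being dropped somewhere in the pipeline.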
I have been trying different things to understand the issue:
- I ran the UKF code in a SITL setup with ground-truth (noise-free) IMU data and ground-truth LIO odometry data.
- This is the graph and the scale of the offset:
- As you can see, the offset in z is around 0.001 m (0.1 cm) here, which is acceptable.
- The offset in x and y is around 0.001 cm.
- Somehow this issue shows up mostly in the z direction.
- Another test I did:
- I started the UKF filter only after takeoff.
- My suspicion is that the IMU gets over-excited during takeoff, and because of this the UKF estimate is always ahead of the LIO odometry.
- When I start the UKF after takeoff, I notice that the offsets are drastically reduced:
- Here is an image: you can see that this time the offsets are very low.
can this be a filter issue that I can fix, or is it just a raw-sensor-data issue? That is, are the sensor outputs offset during takeoff or in any jerky situation?
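The offset numbers quoted above can be computed mechanically by interpolating one odometry stream at the other's timestamps and taking the worst per-axis difference. A minimal sketch, assuming each stream is a list of `(timestamp, z)` pairs extracted from a rosbag (the sample values below are synthetic):

```python
# Sketch: worst-case z offset between two odometry streams,
# linearly interpolating the reference (LIO) stream at each
# UKF timestamp. Inputs are synthetic (t, z) pairs here; real
# data would come from recorded topics.

def max_offset(ukf, lio):
    """ukf, lio: sorted lists of (t, z). Returns max |z_ukf - z_lio|."""
    worst = 0.0
    for t, z in ukf:
        # Find the bracketing LIO samples and interpolate between them.
        for (t0, z0), (t1, z1) in zip(lio, lio[1:]):
            if t0 <= t <= t1:
                z_ref = z0 + (z1 - z0) * (t - t0) / (t1 - t0)
                worst = max(worst, abs(z - z_ref))
                break
    return worst

# Synthetic example: UKF reads 0.60 m where LIO interpolates to 0.50 m.
worst_z = max_offset([(0.5, 0.60)], [(0.0, 0.0), (1.0, 1.0)])
```

Running this separately on the pre-takeoff and post-takeoff segments would make the "offset accumulates during takeoff" claim quantitative.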
thanks and regards
What setup are you using? What is your IMU/LiDAR?
While I am not sure this is related, vibration noise has always been a problem for us. We try to dampen our sensors using these foam pads. You could give that a try as well.
Hi @fcladera, sorry for the late reply.
I have an Ouster LiDAR and an external IMU.
I tried robot_localization yesterday, and these are the results I got:
The UKF here follows the input odometry really well throughout the flight. I'm getting a maximum offset of 1 cm between the UKF's propagated state and the input odometry.
Here is another result for a drone-takeoff-type scenario:
As you can see, no offsets accumulate here during takeoff.
This is the launch file I use for the robot_localization UKF:
<launch>
<node pkg="tf2_ros" type="static_transform_publisher" name="static_transform_publisher1" args="0 0 0 0 0 0 base_link camera_init" />
<node pkg="tf2_ros" type="static_transform_publisher" name="static_transform_publisher2" args="0 0 0 0 0 0 camera_init map" />
<node pkg="tf2_ros" type="static_transform_publisher" name="static_transform_publisher3" args="0 0 0 0 0 0 odom camera_init" />
<node pkg="robot_localization" type="ukf_localization_node" name="ukf_localization" clear_params="true">
<param name="frequency" value="30"/>
<param name="sensor_timeout" value="0.1"/>
<param name="two_d_mode" value="false"/>
<param name="map_frame" value="map"/>
<param name="odom_frame" value="camera_init"/>
<param name="base_link_frame" value="base_link"/>
<!-- <param name="world_frame" value="world"/> -->
<param name="odom0" value="Odometry"/>
<param name="imu0" value="/imu0"/>
<rosparam param="odom0_config">[true, true, true,
true, true, true,
true, true, true,
false, false, false,
false, false, false]</rosparam>
<rosparam param="imu0_config">[false, false, false,
true, true, true,
false, false, false,
true, true, true,
true, true, true]</rosparam>
<param name="odom0_differential" value="false"/>
<param name="imu0_differential" value="false"/>
<param name="imu0_remove_gravitational_acceleration" value="true"/>
<param name="print_diagnostics" value="false"/>
<!-- ======== ADVANCED PARAMETERS ======== -->
<param name="odom0_queue_size" value="200"/>
<param name="imu0_queue_size" value="1000"/>
<param name="debug" value="false"/>
<param name="debug_out_file" value="/home/tmoore/Desktop/debug_ekf_localization.txt"/>
<rosparam param="process_noise_covariance">[0.05, 0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 0.05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.06, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.03, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.03, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 0.06, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.025, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.025, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.04, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.02, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.015]</rosparam>
<rosparam param="initial_estimate_covariance">[1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0.1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9]</rosparam>
</node>
</launch>
I still haven't gone through the robot_localization code base, but my understanding is that the state propagation in FLAUKF may be incomplete, in the sense that the covariance scales are mismatched, so propagation only happens in a coarse way (the tiny corrections, on the order of 1 cm or 2 cm, that should be applied to the state are hard to estimate).
The thing is, most of my offset issues are in the z direction, which I think is one of the hardest states to maintain because the drone has to counter gravity and then hold its state.
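A back-of-envelope check supports the takeoff suspicion: a small, unobserved accelerometer z bias, double-integrated over a short takeoff, produces a persistent position offset of exactly this magnitude. The bias value and duration below are illustrative guesses, not measurements.

```python
# Sketch: double-integrate a constant accelerometer z bias to see
# how much position offset it produces. Values are illustrative.

def drift_from_bias(bias, duration, dt=0.01):
    """Euler-integrate a constant acceleration bias into position."""
    v = p = 0.0
    for _ in range(int(round(duration / dt))):
        v += bias * dt   # bias integrates into a velocity error
        p += v * dt      # which integrates into a position error
    return p

# A hypothetical 0.075 m/s^2 bias over a 2 s takeoff gives roughly
# 0.15 m of drift, comparable to the observed z offset.
drift = drift_from_bias(0.075, 2.0)
```

If the filter starts after takeoff, this transient never enters the state, which would explain why the offsets are drastically reduced in that test.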
Anyway, this is just my understanding. Let me know what you think about this.
Thanks