georust / geographiclib-rs
A port of geographiclib in Rust.
License: MIT License
This unit test for the direct method using GeodTest data currently requires a tolerance of up to 20000 to accept results found for the S12 value. Investigate, presumably fix geodesic behavior, and adjust the S12 tolerance in this test to a more reasonable level.
Also, confirm that this addresses S12 behavior in test_geodtest_geodesic_direct21, and adjust its tolerance. I expect the same bug(s) affect both tests.
This standard Karney unit test currently fails. Fix geodesic to make it work.
This test is another prolate spheroid case, so can probably be a low-ish priority relative to some other test failures.
This standard Karney unit test currently fails. Fix geodesic to make it work.
It tests arc-distance for a case with large-ish flattening (0.1).
This inverse test against GeodTest data currently requires tolerance of 0.02 for azi1 and azi2, and 0.00005 for m12, in order to pass all cases. The azimuth tolerance values in particular are larger than I would expect to be required, and the m12 value is large enough to at least give a quick look.
This is a lower priority than some of the more clear-cut test failures.
This standard Karney unit test currently fails. Fix geodesic to make it work.
Tests longitude unrolling with inverse calculation. It also happens to use a large-ish flattening (0.1), though the test description doesn't highlight that.
Declare "z" as const (renaming to "Z" to keep name idiomatic).
Move check for zero-value "x" earlier, to avoid calculating values that will be discarded.
Putting it together might look something like...
pub fn ang_round(x: f64) -> f64 {
    // This makes the smallest gap in x = 1/16 - nextafter(1/16, 0) = 1/2^57
    // for reals = 0.7 pm on the earth if x is an angle in degrees. (This
    // is about 1000 times more resolution than we get with angles around 90
    // degrees.) We use this to avoid having to deal with near singular
    // cases when x is non-zero but tiny (e.g., 1.0e-200).
    const Z: f64 = 1.0 / 16.0;
    if x == 0.0 {
        0.0
    } else {
        let mut y = x.abs();
        // The compiler mustn't "simplify" z - (z - y) to y
        if y < Z {
            y = Z - (Z - y);
        }
        if x < 0.0 {
            -y
        } else {
            y
        }
    }
}
This direct test against GeodTest data requires tolerances for lon2 and azi2 that are larger than I would expect (4e-6), but not shockingly so. Investigate and see if it can be improved.
Lower priority than some of the more clear-cut test failures.
Would it be possible to release a 0.2.2 version? My PR #45 was merged quite some time ago, and I think it introduced some API changes (namely more derive macros) that could be useful to users.
This inverse test using GeodTest data currently requires an S12 tolerance of 30 billion to pass all cases. Fix geodesic behavior to return more reasonable values, then adjust the tolerance used by this test.
Some changes to consider for geomath::sincosd, starting with the ones that seem most solid.
Calculate "s" and "c" using sin_cos rather than sin and cos: let (s, c) = r.sin_cos();
It should be possible to declare "q" non-mut, and skip the later changes to it, because the initial calculation should already result in the desired value range.
Use native "%" operator instead of fmod.
Consider using "match" rather than "if/else if" when checking "q":
let (s, c) = match (q as usize) & 3usize {
    0usize => (s, c),
    1usize => (c, -s),
    2usize => (-s, -c),
    _ => (-c, s), // 3
};
Contrary to what I said in #3, this is a place where I failed to find a clean way around shadowing or an equivalent. It should be possible to remove it if and when support for destructuring assignment is added.
I also can't remember whether I confirmed that all my casts to usize are needed, or whether that's just paranoia on my part.
Remove the second round of shadowing for the "s" and "c" variables.
Consider replacing "90.0 * q as f64" with "(90 * q) as f64" to favor integer math where practical.
Review any possible difference between "(r / 90.0 + (0.5)).floor()" and "(r / 90.0).round()". I think that both are available in C++, and Charles Karney chose to use floor over round. Review should make sure to include both positive and negative boundary cases (like 0.5 and -0.5). Behavior would be my main concern, but also performance.
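As a quick probe of the boundary behavior in question (not code from the library), the two expressions differ at negative half-way points, because Rust's f64::round rounds halves away from zero:

```rust
fn main() {
    let v = -0.5_f64;
    // floor-based form: floor(-0.5 + 0.5) = floor(0.0) = 0
    println!("{}", (v + 0.5).floor()); // 0
    // f64::round rounds half away from zero: round(-0.5) = -1
    println!("{}", v.round()); // -1
    // At positive half-way points the two agree (both give 1)
    assert_eq!((0.5_f64 + 0.5).floor(), 0.5_f64.round());
}
```

So the choice between the two forms does change behavior exactly at the negative boundary cases mentioned above, and is worth reviewing against the C++ semantics.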
Review handling of NaN and infinite inputs, to make sure behavior is consistent with other versions of geographiclib.
Consider adding a comment about future use of remquo. The C++ version uses this when available, so I expect it allows some performance improvement. The nagisa crate currently has this on a todo list.
Putting it all together, it might look something like this...
pub fn sincosd(x: f64) -> (f64, f64) {
    // In order to minimize round-off errors, this function exactly reduces
    // the argument to the range [-45, 45] before converting it to radians.
    // Consider using remquo here, if and when it becomes available. See C++ version for details. --SL, 2020/07/17
    let mut r = if x.is_finite() { x % 360.0 } else { f64::NAN };
    let q = if r.is_nan() {
        0
    } else {
        (r / 90.0).round() as isize
    };
    r -= (90 * q) as f64;
    r = r.to_radians();
    let (s, c) = r.sin_cos();
    // Modify to avoid shadowing, if and when destructuring assignment becomes available. --SL, 2020/07/17
    let (mut s, mut c) = match (q as usize) & 3usize {
        0usize => (s, c),
        1usize => (c, -s),
        2usize => (-s, -c),
        _ => (-c, s), // 3
    };
    // Remove the minus sign on -0.0 except for sin(-0.0).
    // See also Math::sincosd in the C++ library.
    // AngNormalize has a similar fix.
    if x != 0.0 {
        s += 0.0;
        c += 0.0;
    }
    (s, c)
}
geomath::get_min_val() calculates a value that is already available in rust as f64::MIN_POSITIVE. It would be preferable to use the existing constant rather than defining a function.
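For illustration, a minimal check (assuming get_min_val computes the smallest positive normal f64, i.e. 2^-1022) confirms the standard constant matches:

```rust
fn main() {
    // Smallest positive normal f64 is 2^-1022; the standard library
    // already exposes it as f64::MIN_POSITIVE, so a helper function
    // is unnecessary.
    let computed = 2.0_f64.powi(-1022);
    assert_eq!(computed, f64::MIN_POSITIVE);
}
```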
This standard Karney unit test currently fails. Fix geodesic to make it work.
Tests the "distance from Wellington and Salamanca (a classic failure of Vincenty)", using WGS84.
This standard Karney unit test currently fails. Troubleshoot geodesic to get it working.
Most versions of geographiclib (C, C++, Java, JavaScript, Python) use a set of standard unit tests for validation. Search geographiclib code for "GeodSolve0" for examples. I propose adding these tests to the Rust version as well. Total length is roughly 450 lines.
A few changes to consider for geomath::polyval.
Consider removing the "s" argument; instead, have the calling function slice the "p" argument any time "s" might be non-zero. I think this is more idiomatic rust, and it allows simplifying polyval's loop. That in turn allows dropping the shadowing of the "n" argument (see #3) without even needing to declare "n" as a mut argument. polyval's definition then becomes a more streamlined...
pub fn polyval(n: i64, p: &[f64], x: f64) -> f64 {
    let mut y = if n < 0 { 0.0 } else { p[0] };
    for i in 1..=n {
        y = y * x + p[i as usize];
    }
    y
}
Calls to polyval would need to be modified to pass a slice starting at index "s", rather than starting at 0 and passing "s".
Example 1 (called with fixed "s" of 0):
Before: polyval(m, &COEFF, 0, sq(eps))
After: polyval(m, &COEFF, sq(eps))
Example 2 (called with variable "s" value):
Before: geomath::polyval(m, &COEFF_A3, o as usize, _n)
After: geomath::polyval(m, &COEFF_A3[o as usize..], _n)
(Note that calling contexts like this would be further improved by declaring "o" as usize in the first place, to avoid the need to cast. This change should be straightforward in most cases.)
One note of caution on this change is that C++ also supports similar array offset mechanics, yet Charles Karney chose not to use them in the original C++. I suspect this is mostly a matter of style, but it could hint that there's a slight performance benefit to passing the "s" index, in which case that difference is likely (but not certain) to carry over into rust.
Depending on your change tolerance, I would further consider changing polyval's "n" argument from i64 to isize. The difference is academic for 64 bit platforms, but for values that are primarily used as an index into arrays, it's generally preferable to stick with the architecture's native size. This would involve a change at each calling context. While the changes should be fairly simple, the benefit is also modest.
lazy_static is used once in the whole crate, to declare the wgs84 ellipsoid. So lazy_static is not necessary for the main functionality of the crate, but it is introduced as a compulsory dependency.
Having a reference ellipsoid in the crate is definitely a useful feature, so that users aren't required to define WGS84 manually every time (and it's also useful for tests). But the introduction of an additional dependency for one object seems like a good spot for improvement.
My idea is to add pre-defined ellipsoids (and thus lazy_static) as a default feature. This way no functionality will change for users, but they will be able to use one less dependency when needed.
I can create a PR if such a solution is welcome.
In rust, if you declare a variable with some name, and then later declare a variable with the same name in the same scope, it creates a second variable that hides or "shadows" the first one. See https://qvault.io/2020/05/13/variable-shadowing-in-rust-let-is-immutable-but-not-constant/ for one discussion of this topic.
While this technique can be useful, I don't believe it's desirable in any of the current uses in geomath. I propose removing all instances of this with equivalent mutable variables or arguments. Similar cases likely exist in other files, but I'm limiting this issue to just the geomath file for now, since it's the only one I've reviewed in depth.
Example 1 (shadowed variable):
// Current
pub fn sum(u: f64, v: f64) -> (f64, f64) {
    //...
    let vpp = s - up;
    //...
    let vpp = vpp - v; // Stony: shadows original variable vpp, rather than modifying it
    //...
}

// Proposed
pub fn sum(u: f64, v: f64) -> (f64, f64) {
    //...
    let mut vpp = s - up;
    //...
    vpp -= v;
    //...
}
Example 2 (shadowed argument):
// Current
pub fn polyval(n: i64, p: &[f64], s: usize, x: f64) -> f64 {
    let mut s = s; // Stony: shadows argument s, creating an additional variable
    let mut n = n; // Stony: shadows argument n, creating an additional variable
    //...
}

// Proposed
pub fn polyval(mut n: i64, p: &[f64], mut s: usize, x: f64) -> f64 {
    // (remove redeclarations, not replacing with anything)
    //...
}
Note that I will later propose removing polyval's "s" argument entirely, in a separate issue.
Hello,
I'm getting a compilation error when implementing the first example mentioned in https://crates.io/crates/geographiclib-rs
Compiling geographiclib-rs v0.2.3
error[E0195]: lifetime parameters or bounds on type `Target` do not match the trait declaration
--> /Users/d029158/.cargo/registry/src/index.crates.io-6f17d22bba15001f/geographiclib-rs-0.2.3/src/geodesic.rs:45:1
|
45 | / lazy_static! {
46 | | static ref WGS84_GEOD: Geodesic = Geodesic::new(WGS84_A, WGS84_F);
47 | | }
| |_^ lifetimes do not match type in trait
|
= note: this error originates in the macro `__lazy_static_internal` which comes from the expansion of the macro `lazy_static` (in Nightly builds, run with -Z macro-backtrace for more info)
I got the same error when compiling a program that uses surrealdb, since surrealdb has a dependency on geographiclib-rs v0.2.3.
I'm using a MacBook with the following processor: 2,3 GHz 8-Core Intel Core i9.
Do you have an idea what the problem is?
Kind regards,
Stephan
Currently, geomath::norm passes its 2 arguments by value and returns a size-2 tuple. Every caller then turns around and assigns the returned values back to the inputs, or to variables that shadow the inputs. Both the C++ and Java versions use in-out arguments to achieve this effect more smoothly. I think the Python version doesn't, because it's more difficult to achieve that in-out effect with a non-object type in Python.
I think we should change the function to use in-out arguments (&mut f64), like the C++ and Java versions. My main reason for this is that it will make all my subsequent code comparisons easier by removing one type of possible copy/paste or typo error in this often-used function, where a result value could accidentally be assigned to the wrong variable. Secondarily, it makes the code cleaner, and probably just a little bit faster (though probably not enough to be reliably noticeable in benches).
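A minimal sketch of the proposed signature (the body here assumes norm scales a 2-vector to unit length, as in the C++ and Java versions; the actual implementation may differ):

```rust
// Hypothetical in-out version of geomath::norm: normalize (x, y)
// in place instead of returning a tuple the caller must reassign.
pub fn norm(x: &mut f64, y: &mut f64) {
    let h = x.hypot(*y);
    *x /= h;
    *y /= h;
}

fn main() {
    let (mut x, mut y) = (3.0_f64, 4.0_f64);
    norm(&mut x, &mut y); // hypot(3, 4) = 5
    assert!((x - 0.6).abs() < 1e-15);
    assert!((y - 0.8).abs() < 1e-15);
}
```

With this shape, a call site becomes a single statement with no tuple reassignment, removing the class of copy/paste error described above.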
This standard Karney unit test currently fails. Fix geodesic behavior to make it work.
This test covers an "antipodal prolate" case, which is to say points on opposite sides of a spheroid shaped like an egg (elongated rather than squashed). My sense is that this is likely to be a pretty unusual case in practice, since no planetary body is likely to have this shape, so this can probably be a lower priority than some other test failures.
Use mutable arguments rather than shadowing.
Use std::mem::swap to swap "x" and "y".
Integer "q" instead of f64.
"match" instead of "if/else-if" when checking "q".
Borrow comments from c++ counterpart.
Putting this all together, it might look something like...
pub fn atan2d(mut y: f64, mut x: f64) -> f64 {
    // In order to minimize round-off errors, this function rearranges the
    // arguments so that the result of atan2 is in the range [-pi/4, pi/4]
    // before converting it to degrees and mapping the result to the correct
    // quadrant.
    let mut q = if y.abs() > x.abs() {
        std::mem::swap(&mut x, &mut y);
        2
    } else {
        0
    };
    if x < 0.0 {
        q += 1;
        x = -x;
    }
    match q {
        // Note that atan2d(-0.0, 1.0) will return -0. However, we expect that
        // atan2d will not be called with y = -0. If need be, include
        //
        //   case 0: ang = 0 + ang; break;
        //
        // and handle mpfr as in AngRound.
        1 => (if y >= 0.0 { 180.0 } else { -180.0 }) - y.atan2(x).to_degrees(),
        2 => 90.0 - y.atan2(x).to_degrees(),
        3 => -90.0 + y.atan2(x).to_degrees(),
        _ => y.atan2(x).to_degrees(),
    }
}
The current code has moved several functions that refer to this constant out of geodesic and into geomath, where the constant is not available. Instead, it is passed as an argument. I think it would be preferable to allow it to be referenced directly as a constant, rather than passing as an argument. This is partly a style choice, but should also yield a slight performance gain.
This could be done by either moving the constant from geodesic to geomath, or by moving the consuming functions from geomath back to geodesic. The choice of which approach to use might depend in part on what motivated the decision to move those functions to geomath in the first place. All other things being equal, it seems preferable to define the functions in geodesic, for symmetry with sibling versions of geographiclib.
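A sketch of the first option (the constant name is taken from the issue; the value 6 matches geographiclib's standard order, and the consuming function here is a simplified stand-in, not an actual geomath function):

```rust
// In geomath: the constant lives alongside its consumers, so callers
// no longer need to thread it through as an argument.
pub const GEODESIC_ORDER: usize = 6;

// Hypothetical simplified consumer: sizes a coefficient buffer directly
// from the constant instead of taking a geodesic_order parameter.
pub fn coeff_buffer() -> Vec<f64> {
    vec![0.0; GEODESIC_ORDER + 1]
}

fn main() {
    assert_eq!(coeff_buffer().len(), 7);
}
```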
This standard Karney unit test currently fails. Fix geodesic to make it work.
"An example where the NGS calculator fails to converge", using inverse on WGS84.
Many functions have stub doc comments (///), but more detail would be preferable. The C++ code typically includes expansive comments and could be a good source to start from.
I have not yet explored annotation syntax for rust doc comments, so I'm unsure how much conversion would be needed when copying doc comments from c++ or other geographiclib counterparts.
geomath::get_epsilon is calculating a value that rust already has available as the constant f64::EPSILON. It would be preferable to reference that constant directly, rather than defining a function for it.
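As a quick sanity check (assuming get_epsilon computes the machine epsilon for f64, i.e. 2^-52), the standard constant already holds the same value:

```rust
fn main() {
    // Machine epsilon for f64 is 2^-52; the standard library constant
    // f64::EPSILON makes a helper function unnecessary.
    assert_eq!(2.0_f64.powi(-52), f64::EPSILON);
}
```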
Some avenues to consider, in roughly chronological order of observation, include:
- M12 = sig12.cos();
- M21 = sig12.cos(); // instead use M21 = M12;
- |=
None is likely to make much impact individually, but they might add up to something together.
I'll continue adding items to this list as I explore issues with returned values, until we get around to actually trying to make performance improvements.
I propose replacing all calls to geomath::cbrt with f64::cbrt, and then deleting the geomath::cbrt function. I don't think the additional function adds any value.
There are already comments in code proposing this change, but I figured it was still worth submitting an issue for it.
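A small check that f64::cbrt covers what a hand-rolled cube root must handle, including negative inputs (which a naive powf approach gets wrong):

```rust
fn main() {
    // f64::cbrt handles negative arguments directly...
    assert!((8.0_f64.cbrt() - 2.0).abs() < 1e-12);
    assert!(((-27.0_f64).cbrt() + 3.0).abs() < 1e-12);
    // ...unlike powf(1.0/3.0), which returns NaN for a negative base.
    assert!((-8.0_f64).powf(1.0 / 3.0).is_nan());
}
```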
Suggested by @michaelkirk, along with a few starting thoughts...
Try to integrate @michaelkirk's validation tool (https://github.com/michaelkirk/validate_geographiclib) into https://github.com/georust/geographiclib-rs CI.
There are lots of different approaches to performance work, and I'm not an expert, but the tools I found most helpful were:
- https://crates.io/crates/criterion for performance regression testing and A/B testing
- https://github.com/flamegraph-rs/flamegraph for profiling
Further thoughts from @stonylohr...
While @michaelkirk didn't specify, I'm currently interpreting his suggestion to include both validation testing and performance testing.
I don't currently interpret the suggestion to specifically endorse unit tests, integration tests, or a mix of the two (or something else) for validation testing of this sort. I read his use of the term "integrate" as being a casual use rather than referring specifically to integration testing. My default plan in this regard will be to play around with things and use whatever mix seems to come together most easily.
Type usize is required for the index into an array or slice. It is generally preferable to go ahead and use this type from the outset for variables for which this is the primary purpose, to reduce the need for casting later. Unless there is a specific reason to do otherwise, of course.
Examples of places that might benefit from this change...
geodesic: GEODESIC_ORDER
geomath::_A1m1f: geodesic_order
geomath::_C1f: geodesic_order, m, o
geomath::_A2m1f: m
geomath::_C2f: o, m
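A hypothetical before/after (names illustrative, not the actual signatures) showing what the change looks like at an indexing site:

```rust
// Before (hypothetical): the index arrives as i64 and must be cast.
fn coeff_before(p: &[f64], o: i64) -> f64 {
    p[o as usize]
}

// After: declared usize from the outset; no cast at the indexing site.
fn coeff_after(p: &[f64], o: usize) -> f64 {
    p[o]
}

fn main() {
    let p = [1.0, 2.0, 3.0];
    assert_eq!(coeff_before(&p, 1), coeff_after(&p, 1));
}
```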
This standard Karney unit test currently fails. Fix geodesic behavior to make it work.
This tests a spherical case. I initially think of this as a medium-to-low priority. NASA's factsheet for Venus lists it as spherical, and it would be unsurprising if other spherical cases do really come up in practice, but they'll tend to be much less common than oblate cases.
From #17 (comment)
I would favor also expanding the benchmarks before starting on troubleshooting the core code. This might boil down to offering the same builtin/small/full options on the rs benches that you added to the unit tests. I suggest replacing the benches' current data-copying approach with Arc+Mutex at the same time. That improved speed by about 66% in some benches I compared, though it may depend in part on the size of the test dataset.
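A minimal sketch of the sharing pattern (dataset contents and types illustrative): clones of the Arc are cheap pointer copies rather than copies of the test data, which is where the speedup would come from.

```rust
use std::sync::{Arc, Mutex};

fn main() {
    // Shared test dataset: cloning the Arc copies a pointer,
    // not the underlying Vec.
    let dataset: Arc<Mutex<Vec<(f64, f64)>>> =
        Arc::new(Mutex::new(vec![(1.0, 2.0), (3.0, 4.0)]));
    let for_bench = Arc::clone(&dataset); // cheap, no data copy
    let n = for_bench.lock().unwrap().len();
    assert_eq!(n, 2);
}
```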
I propose replacing all calls to geomath::fmod with equivalent direct uses of the % operator, and then deleting geomath::fmod.
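For floats, Rust's % operator already has fmod semantics (remainder from truncated division, taking the sign of the dividend), so the swap should be behavior-preserving:

```rust
fn main() {
    // % on f64 matches C's fmod: truncated division, sign of the dividend.
    assert_eq!(5.5_f64 % 2.0, 1.5);
    assert_eq!(-5.5_f64 % 2.0, -1.5);
    assert_eq!(720.5_f64 % 360.0, 0.5);
}
```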