
slf-dot-ch / snowmicropyn


A Python package to read, export and post-process data (*.pnt files) recorded by SnowMicroPen, a snow penetration probe for scientific applications developed at SLF.

Home Page: https://www.slf.ch/en/services-and-products/research-instruments/snowmicropen-r-smp4-version.html

License: GNU General Public License v3.0

Languages: Python 99.54%, Shell 0.19%, HTML 0.27%
Topics: snowmicropen, snow science, slf

snowmicropyn's People

Contributors

hloewe, m9brady, marscho1, reisecker, thiemot


snowmicropyn's Issues

FutureWarning: The pandas.np module is deprecated

Hi there,
I am not sure if I am the only one, but I get a deprecation warning when using snowmicropyn:

"FutureWarning: The pandas.np module is deprecated and will be removed from pandas in a future version. Import numpy directly instead"

As far as I have checked, the affected files are the following (all in the snowmicropyn directory):

  • windowing.py
  • tools.py
  • proksch2015.py
  • profile.py
  • loewe2012.py
  • detection.py

At the moment I am just suppressing the warnings, but I thought it might be helpful for all users if this were fixed before it becomes a problem :)
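
A minimal sketch of the fix, assuming the affected modules currently reach numpy through pandas (e.g. pd.np.asarray); the warning itself suggests importing numpy directly:

    # Before (deprecated): reaching numpy through pandas
    # import pandas as pd
    # values = pd.np.asarray([0.1, 0.2, 0.3])

    # After: import numpy directly, as the FutureWarning suggests
    import numpy as np
    values = np.asarray([0.1, 0.2, 0.3])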

Notify user of invalid pnts instead of crashing

Currently, snowmicropyn hard-exits when encountering an invalid pnt file, which is especially annoying when batch-loading a bunch of files.

The low-level call to initialize a profile happens in a class constructor, which currently is forced to create something and return it. Traversing the calling chain upwards, every module probably needs some small new logic to handle the opposite case.
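
A minimal sketch of the desired behaviour on the calling side, assuming Profile.load raises an exception for an invalid file (the broad except is a placeholder; a dedicated exception type, e.g. a hypothetical InvalidPntError, would be cleaner):

    import logging

    from snowmicropyn import Profile

    log = logging.getLogger(__name__)

    def load_profiles(paths):
        """Batch-load pnt files, skipping invalid ones instead of aborting."""
        profiles = []
        for path in paths:
            try:
                profiles.append(Profile.load(path))
            except Exception as err:  # ideally a dedicated InvalidPntError
                log.warning('Skipping invalid pnt file %s: %s', path, err)
        return profiles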

Problem in the calculation of density and SSA

In snowmicropyn v1.0.0 the calculation of density and SSA yields strange values, since the calculation of the parameters of the underlying penetration model was carried out as originally published in (Löwe and van Herwijnen, CRST, 2012) and not as employed in (Proksch et al., JGR, 2015): the latter uses an additional de-trending of the signal prior to the calculation of the force correlation function.
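
A minimal sketch of the difference, assuming the force signal of one analysis window is available as a 1-D array (function and argument names are illustrative):

    import numpy as np
    from scipy.signal import detrend

    def force_autocorrelation(force, detrend_signal=True):
        """Correlation function of the penetration-force fluctuations.

        Proksch et al. (2015) remove a linear trend from the force signal
        before computing the correlation function, whereas Loewe and van
        Herwijnen (2012) use the fluctuations around the mean only.
        """
        f = detrend(force) if detrend_signal else np.asarray(force) - np.mean(force)
        c = np.correlate(f, f, mode='full')
        return c[c.size // 2:] / len(f)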

Pyngui failure/crash with some *.pnt files

Hi there,

sometimes my pyngui just closes/crashes while trying to open PNT files.

That's the failure report from the terminal:

    Read 24782 raw samples from file /Users/Documents/06_EQUIPMENT/SMP/test_data/S44M0295.pnt
    Timestamp of profile as reported by pnt header is 2020-10-20 06:49:37+00:00
    /Users/anaconda3/lib/python3.6/site-packages/snowmicropyn/loewe2012.py:58: RuntimeWarning: invalid value encountered in double_scalars
      delta = -(3. / 2) * c_f[n - 1] / (c_f[n] - c_f[n - 1]) * spatial_res
    Traceback (most recent call last):
      File "/Users/anaconda3/lib/python3.6/site-packages/snowmicropyn/pyngui/main_window.py", line 354, in _open_triggered
        self.open_pnts(files)
      File "/Users/anaconda3/lib/python3.6/site-packages/snowmicropyn/pyngui/main_window.py", line 366, in open_pnts
        doc.recalc_derivatives(self.preferences.window_size, self.preferences.overlap)
      File "/Users/anaconda3/lib/python3.6/site-packages/snowmicropyn/pyngui/document.py", line 31, in recalc_derivatives
        self._derivatives = proksch2015.calc(samples, window_size, overlap_factor)
      File "/Users/anaconda3/lib/python3.6/site-packages/snowmicropyn/proksch2015.py", line 76, in calc
        sn = snowmicropyn.loewe2012.calc(samples, window, overlap)
      File "/Users/anaconda3/lib/python3.6/site-packages/snowmicropyn/loewe2012.py", line 89, in calc
        sn = calc_step(spatial_res, chunk.force)
      File "/Users/anaconda3/lib/python3.6/site-packages/snowmicropyn/loewe2012.py", line 61, in calc_step
        lambda_ = (4. / 3) * (k1 ** 2) / k2 / delta  # Intensity
    ZeroDivisionError: float division by zero
    Abort trap: 6

I just compared the pnt files where it happens with the files where it does not happen. If there are more than about 900 entries of the same value at the end of the file, pyngui crashes; with only around 300 such entries at the end of the file it still works.
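
A minimal sketch of a possible guard around the per-window shot-noise calculation, assuming the tuple returned by loewe2012.calc_step has four entries as suggested by the traceback (the guard itself is a proposal, not the library's current behaviour):

    import math

    from snowmicropyn.loewe2012 import calc_step  # per-window shot-noise model

    def safe_calc_step(spatial_res, forces):
        """Shot-noise parameters for one window, NaN for degenerate windows.

        A window consisting of (nearly) identical force values has zero
        variance, which drives the correlation-length estimate delta to zero
        and triggers the ZeroDivisionError shown above. Returning NaNs for
        such windows keeps batch processing alive.
        """
        if len(set(forces)) <= 1:  # constant force signal, no fluctuations
            return (math.nan,) * 4  # assumed shape: lambda, f0, delta, L
        try:
            return calc_step(spatial_res, forces)
        except ZeroDivisionError:
            return (math.nan,) * 4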

Layer marking from the GUI

Thanks to the team of LWD Tyrol we will have manually marked layers (through snowmicropyn) for parts of the RHOSSA dataset. Since matching SMP and hand profiles is very tricky, this would be an always-available "last resort" to produce good training data. At the very least this would offer a closed system, if only by transferring the responsibility to the user. :-)

I think it would be nice if we facilitated this from within snowmicropyn by adding a set of visually appealing new markers for marking layers with the same grain shape.

This would require some new logic, because currently everything is built around the idea that no two markers may have the same name; in this case that is exactly what we would want handled automatically (so markers don't have to be named rg_start1, rg_end1, rg_start2, ... by hand), as sketched below.
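
A minimal sketch of such automatic numbering, assuming markers are kept as a name-to-position mapping; the rg_start/rg_end scheme simply mirrors the example above and is purely illustrative:

    import itertools

    def next_layer_marker_names(existing_names, grain_shape='rg'):
        """Return a free (begin, end) marker-name pair for a new layer.

        Counts up (rg_start1/rg_end1, rg_start2/rg_end2, ...) until an unused
        pair is found, so the user never has to number the markers by hand.
        """
        for i in itertools.count(1):
            start = f'{grain_shape}_start{i}'
            end = f'{grain_shape}_end{i}'
            if start not in existing_names and end not in existing_names:
                return start, end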

Wrong units for SSA in output CSV file

The values for the specific surface area which are written into the CSV file have units of 1/mm and not 1/m (= m²/m³) as stated in the column header.
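
A minimal sketch of the conversion the export would need, assuming the computed SSA values are currently in 1/mm (alternatively, the column header could simply be corrected to 1/mm):

    MM_PER_M = 1000  # 1 mm^-1 = 1000 m^-1

    def ssa_per_m(ssa_per_mm):
        """Convert specific surface area from 1/mm to 1/m (= m^2/m^3)."""
        return ssa_per_mm * MM_PER_M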

Improve profile <-> air gap interaction

It is now possible to reasonably work with the superposition view in conjunction with hiding the air gap: a set of profiles can be loaded. If they have surface markers, toggling the air gap will also work in the superposed view. If not, the marker can be added via the toolbox and everything will update automatically.

However, there are places missing this interaction, e.g. the context menu to set markers and when deleting markers. In these cases the files' ini files must be saved and reloaded.

To solve this inconvenience, a convoluted piece of the old code needs to be untangled:

# This is a bit tricky: We call the methods on main_window which

Since all code for this feature is probably going to be completely replaced as issue #12 becomes more and more pressing, this is as far as it reasonably goes for now; hopefully it will be resolved within issue #12.

Introduce "SnowPit" layer of abstraction

Currently there is a bit of a clash between object-oriented data classes and a more functional style of programming for the physics. This is natural for GUIs controlling data output. Still, there is a bit of a backlog from the times when snowmicropyn seemed to be mainly about displaying the SMP forces.
Issues can usually be worked around because the structure is generally good, but there is some overhead, loss of clarity, and loss of elegance (e.g. by having to put instructions in locations that feel out of place).

This gap is partly bridged by the Document class, which holds a profile and some derived values (e.g. a drift fit), but it is GUI-oriented and does not do all that much. In fact this is now where the AI stuff is called from, because it cannot be done from Profile or CAAML for logical reasons (attempts rightfully fail with circular imports). It works out, but in the CAAML unit test, for example, we have to include something from pyngui even though it has nothing to do with the GUI.

Instead of, or in addition to, this Document class for the GUI, I propose a new class Pit which holds a) the profile and b) metadata including several derived properties (if we want full backwards compatibility we may need to just keep and extend the existing Document); a rough sketch follows the list below.

Depending on how much time is spent this could have a big reach:

  • Derivatives would have a central place to live. Currently, with the functional calls, it is up to the user/developer to store them somewhere and keep them in sync when options change. For example, the plot function calculates the values and stores them, the CAAML export calculates and stores them, and so does the normal CSV output, each separately.
  • This class could cache the data and recalculate only if necessary.
  • Opens up the possibility for lazy evaluation, i.e. not only recalculate when necessary, but also calculate only when needed.
  • It could handle things like hiding the air gap transparently; different GUI features (e.g. superposition) would not have to handle it themselves.
  • Should facilitate parallelization in the future.
  • Would fit the modules we currently have neatly: Pit would be parent of a static Profile with measurements and recorded meta data only, a derivatives object for low level parameterizations, a classifier object for machine learning, and the Pit would also be the place to house the final export where all of this is combined.
  • Probably, this class should also manage dynamic warping in the future. We could already include this in the object factories etc. and then existing code could hopefully be integrated more easily (as the framework would already exist).
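
A minimal sketch of such a class, under the assumptions above; the name Pit, the attribute layout and the default windowing values are illustrative, not the current API:

    class Pit:
        """Container for one SMP measurement: raw profile, metadata, derivatives.

        Derived quantities are computed lazily and cached; changing the
        windowing options invalidates the cache, so every consumer (plot,
        CSV export, CAAML export) reads the same, consistent values.
        """

        def __init__(self, profile, metadata=None, window_size=2.5, overlap=50):
            self.profile = profile        # static measurements and recorded metadata
            self.metadata = metadata or {}
            self._window_size = window_size
            self._overlap = overlap
            self._derivatives = None      # cache, filled on first access

        @property
        def derivatives(self):
            if self._derivatives is None:  # lazy evaluation: compute only when needed
                from snowmicropyn import loewe2012
                self._derivatives = loewe2012.calc(
                    self.profile.samples, self._window_size, self._overlap)
            return self._derivatives

        def set_windowing(self, window_size, overlap):
            """Change options and invalidate cached derived values."""
            self._window_size, self._overlap = window_size, overlap
            self._derivatives = None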

Apart from the refactoring that is always looming in open-source projects, this point could be tackled separately while still having great overall ramifications. However, it is not a quick fix at all.

Just my 2 cents!

Implementing different parameterizations to choose from

Hi,

I was wondering if different parameterizations could be implemented as selectable options in snowmicropyn: besides Proksch2015, also Calonne2020 (alpine snowpack) and King2020 (Arctic sea ice). It wouldn't be much work, but would probably be a nice addition. Locally, I have in principle only used proksch2015.py as a base and exchanged the coefficients and names/references to create calonne2020.py and king2020.py; see the sketch below.
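
A minimal sketch of how this could be done with one parameterized module instead of several copies, assuming all three publications keep the log-linear functional form of Proksch et al. (2015). The Proksch coefficients below are the ones used in proksch2015.py and should be double-checked against the publication; the Calonne 2020 and King 2020 entries are deliberate placeholders to be filled in from the respective papers:

    import math

    # Coefficient sets (a, b, c, d) for rho = a + b*ln(F) + c*ln(F)*L + d*L,
    # with median force F in N and structural element size L in mm.
    DENSITY_COEFFS = {
        'proksch2015': (420.47, 102.47, -121.15, -169.96),
        'calonne2020': None,  # fill in from Calonne et al. (2020)
        'king2020': None,     # fill in from King et al. (2020)
    }

    def density(median_force, element_size, parameterization='proksch2015'):
        """Snow density [kg/m^3] for the chosen parameterization."""
        coeffs = DENSITY_COEFFS[parameterization]
        if coeffs is None:
            raise NotImplementedError(
                'coefficients for %s not filled in yet' % parameterization)
        a, b, c, d = coeffs
        log_f = math.log(median_force)
        return a + b * log_f + c * log_f * element_size + d * element_size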

Implement progress bar in the GUI

Sooner or later we will have to optimize in some places, but partly we also simply need some feedback from the GUI. We should have a progress bar for higher-level operations and a "calculating" icon for the lower-level stuff and for things we can't / don't want to interrupt.

We are doing some heavy calculations so we will always need some time, but I think just showing that we are crunching would benefit the user experience greatly.
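
A minimal sketch for the higher-level case, assuming pyngui stays PyQt5-based; the open_single callback stands in for the existing per-file open routine:

    from PyQt5.QtCore import Qt
    from PyQt5.QtWidgets import QApplication, QProgressDialog

    def open_with_progress(parent, files, open_single):
        """Open a batch of pnt files while showing a cancellable progress bar."""
        dialog = QProgressDialog('Opening pnt files...', 'Cancel', 0, len(files), parent)
        dialog.setWindowModality(Qt.WindowModal)
        for i, f in enumerate(files):
            if dialog.wasCanceled():
                break
            dialog.setValue(i)
            dialog.setLabelText('Opening {} ...'.format(f))
            QApplication.processEvents()  # keep the GUI responsive during the work
            open_single(f)                # the actual heavy lifting
        dialog.setValue(len(files))       # reaching the maximum closes the dialog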
