
hatyan's People

Contributors

dependabot[bot], evetion, michielcuijpers, veenstrajelmer


hatyan's Issues

Knowledge development

Develop/disseminate knowledge:

  • Xfactor research, like #245
  • Component splitting research
  • Expand component set with e.g. SSA+MSQM (SSA has no added value with predictions, t_tide uses 118 components)
  • 4-year or longer analysis, all at once (with fu_alltimes) or per year
  • Analyse 19 years without nodal factors (no correction is needed then). Or analyse 20 years per year and look at the mean, trend and std
  • Nodal factors on each timestep is desirable in the future. However, old component files have to remain usable, so they are automatically coupled to nodal factors in the middle of the period. Add better metadata to the component file to distinguish between center/alltimesteps
  • Discuss LAT implementation with MV

Prepare release 2.8.0

TODO:

Overview of user impact of finished todo's:

  • https://github.com/Deltares/hatyan/blob/main/docs/whats-new.md
  • #114
  • #116
  • #127
  • #131
  • #143
  • #175 (including support for old hatyan1 files without analysis settings)
  • #196 (this change makes it impossible to read hatyan<=2.7.0 component files since they do not contain STAT/PERD metadata lines, an error is raised)
  • #218 (prevent prediction with different settings than in components metadata and prevent non-standard nodalfactors/fu_alltimes/source in component file)
  • #223 (deprecated hatyan.write_tsdia_HWLW())
  • #229 (makes the hatyan command available directly on the command line, instead of via python -m hatyan)
  • #236 (prevented most of the prints, now to be enabled with logging module)
  • #242 (improved documentation on github)
  • #260 (removed station/vertref/tzone_hr arguments for hatyan.write_tsnetcdf(), comes from timeseries dataframes now, tzone was removed in #258)
  • #265 (dropped "ts" from all read/write timeseries function names)
  • #268 (hatyan.prediction() does not accept settings anymore, they are derived from the components dataframe now, this prevents accidental mistakes)
  • #288 (prediction timestep now only accepts frequency string like "10min", not 10 anymore)
  • #292 (renamed CS_comps argument for hatyan.prediction() to lowercase)
  • #300 (deprecated comp_sec_list argument for hatyan.merge_componentgroups(), now use .iloc[] on comp_sec instead)
  • overview by comparing the example script: old (2.7.0) vs. new/simplified (main)
  • improved module documentation at deltares.github.io

FEWS HWLW numbering impact:

  • removed station/vertref/tzone_hr arguments for hatyan.write_tsnetcdf(): #258 and
    #260
  • renamed hatyan.writets_netcdf() to hatyan.write_netcdf() #265

After release:

  • build method was updated, check if data was packaged in wheel (pip install in clean env)

Follow-up: #310

Incorrect low water for FEWS timeseries

calc_HWLW incorrectly computes LW instances in a FEWS timeseries for HOEKVHLD, hvh_example.nc:

It also incorrectly computes an additional HW instance in another FEWS timeseries for HOEKVHLD, H_Import_prediction_RWS.nc:

This makes it impossible to use calc_HWLWnumbering, it raises "Exception: tidal wave numbering: LW numbers not always increasing"

MWE:

import pandas as pd
import xarray as xr
import hatyan

hatyan.close('all')

# file_nc = 'hvh_example.nc'
file_nc = 'H_Import_prediction_RWS.nc'
data_xr = xr.open_dataset(file_nc)

# Extract the waterlevel timeseries for the relevant station from the xarray dataset
wl_timeseries = data_xr.isel(stations=3).waterlevel #isel3 is hoekvhld

# Then store it as a pandas dataframe and compute extremes
wl_pd = pd.DataFrame({'values':wl_timeseries},index=wl_timeseries.time)
wl_pd_ext = hatyan.calc_HWLW(wl_pd)

# For testing: plot water level timeseries with peaks identified and labeled. 
hatyan.plot_timeseries(ts=wl_pd,ts_ext=wl_pd_ext)

# Assign numbers to the extremes
wl_pd_ext = hatyan.calc_HWLWnumbering(wl_pd_ext)

Example files:
hvh_example.nc.txt
H_Import_prediction_RWS.nc.txt

Add support for WIA files

Todo:

  • #185
  • preferably first #145
  • update to aquometadata standard? (Ddlpy column names?)
  • consider separate function for dia/wia, string type argument is not really convenient
  • wiafile contains no WNS (waarnemingsoort), is this deliberate?
  • add flexible dia/wia metadata and make it user-adjustable
  • incl NARWL naming convention
  • think about metadata xml coupling

Simplify datetimeindex for 1018

Since pandas>=2.0.0 supports non-nano timestamps: pandas-dev/pandas#52164

Relevant for year 1018 and others, which raised OutOfBoundsDatetime before. Update robust_daterange_fromtimesextfreq() and robust_timedelta_sec().

This now works in pandas 2.0.0:

import pandas as pd
aa = pd.date_range("1018-01-01", "1018-01-02", freq="600min", unit="us")
print(aa)

# equality check between timestamps of unit us and ns also succeeds
print(aa[0] == pd.Timestamp("1018-01-01"))

Unit microseconds (us) gives plenty of precision and extent: https://numpy.org/doc/stable/reference/arrays.datetime.html#datetime-units

Wait on:

  • fix dfm_tools: Deltares/dfm_tools#507
  • update HMC env (was pandas 1.5.3) (email to JR 11-8-2023) >> hydrolib requires (required) pandas<2.0.0 so conflicts at HMC >> hydrolib will be removed

Admin:

  • update requirements.txt, pyproject.toml

Suggestions for features and changes

Based on my usage and the features that some other tidal analysis software packages have, I think the following would be helpful:

  1. A flag to switch off the printed output during analysis and prediction, OR preferably save the output in a log file, like dflowfm does with a .dia file.
  2. Tidal ellipse parameters as output instead of amplitude and phase would be a nice feature. For that, a tidal analysis should be done on u and v together. In fact, tidal analysis of currents should be done on u and v together, as opposed to the current approach of analysing u and v separately (see the sketch after this list).
  3. Confidence interval and signal-to-noise ratio output can be helpful, since data with large uncertainties can be flagged with these values. See T-Tide or U-Tide.
  4. Possibility to include a trend in the analysis. Currently it is assumed that tides do not change over time, but it would be nice if we could include a trend, say linear, in the analysis. See https://agupubs.onlinelibrary.wiley.com/doi/full/10.1029/2022JC018845 and T-Tide.
  5. A nice harmonic analysis to have would be tidal harmonic analysis of GNSS buoy data. This is somewhat different from harmonic analysis of tide gauges, since in this case the buoy is spatially varying. Very few software packages actually have this feature.
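
For point 2, a minimal sketch of the standard amplitude/phase to tidal-ellipse conversion, following the ap2ep convention (Xu, 2000) that accompanies T-Tide/U-Tide. This is not part of hatyan and the sign conventions are not verified against hatyan's phase definition:

import numpy as np

def ap2ep(Au, phiu_deg, Av, phiv_deg):
    # complex amplitudes of the u and v constituents
    u = Au * np.exp(-1j * np.deg2rad(phiu_deg))
    v = Av * np.exp(-1j * np.deg2rad(phiv_deg))
    wp = (u + 1j * v) / 2            # counterclockwise rotating vector
    wm = np.conj(u - 1j * v) / 2     # clockwise rotating vector
    sema = np.abs(wp) + np.abs(wm)   # semi-major axis
    semi = np.abs(wp) - np.abs(wm)   # semi-minor axis (sign gives rotation sense)
    inc = np.rad2deg((np.angle(wm) + np.angle(wp)) / 2) % 180  # inclination
    pha = np.rad2deg((np.angle(wm) - np.angle(wp)) / 2) % 360  # phase
    return sema, semi, inc, pha

# example for a single constituent:
# sema, semi, inc, pha = ap2ep(Au=0.5, phiu_deg=30, Av=0.2, phiv_deg=110)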

fix sigrid-publish action

https://github.com/Deltares/hatyan/actions/workflows/sigrid-publish.yml

The GitHub sigrid-publish action raises an error:

Run ./sigridci/sigridci/sigridci.py --customer deltares --system hatyan --source . --publish
  ./sigridci/sigridci/sigridci.py --customer deltares --system hatyan --source . --publish
  shell: /usr/bin/bash -e {0}
  env:
    SIGRID_CI_TOKEN: ***
2023-06-29 14:55:13  Starting Sigrid CI
2023-06-29 14:55:13  You are not authenticated to Sigrid (HTTP status 401), please check if your token is valid
Error: Process completed with exit code 1.

Improve code quality

Todo:

  • convert TODO comments from hatyan functions and example scripts into GitHub issues (https://github.com/search?q=repo%3ADeltares%2Fhatyan+TODO&type=code, 55 occurrences on 15-12-2022, 14 on 13-5-2024)
  • reduce code complexity of analysis_prediction.py
  • clean up code with help of SIG. Does the status badge show the quality status or only completion of the action? Is it possible to let the badge link to a public online dashboard with code analysis results? To let the badge show a color-coded code rating? To (also) get all-code scores back from the analysis (now only new code and often N/A)?
  • improve code quality by solving sonarcloud severe issues (23 issues at 7-5-2024)
  • move from netcdf4 to xarray for hatyan.write_tsnetcdf() >> not possible since we also use append option. We do need to do something, since hatyan.write_netcdf() has a cognitive complexity of 45 (max allowed is 15)
  • style guide: https://www.python.org/dev/peps/pep-0008/
  • isort/black
  • argument type checking (pydantic?) >> hatyansettings
  • introduce types for essential variables: https://docs.python.org/3/library/typing.html (and update docstring)
  • Check flake8 warnings, maybe in combination with the black code formatter

Set hatyan global defaults

Like matplotlib's matplotlib.rcParams["savefig.directory"]. A minimal sketch follows the list below.

For instance:

  • maybe all ana/pred settings, but also allow for comparison of different settings. Xfac on/off is already necessary in RWS scripts, so this might not be useful.
  • debug level to limit printing
  • figsizes
  • component splitting: CS_comps_derive and CS_comps_from (in combination with #140)
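
A hypothetical sketch of module-level defaults, analogous to matplotlib.rcParams. The dict name, keys and default values are assumptions, not an existing hatyan API:

defaults = {
    "plot.figsize": (14, 7),   # figsizes
    "logging.level": "INFO",   # debug level to limit printing
    "analysis.xfac": True,     # ana/pred setting, possibly not useful as a global default
}

def set_defaults(updates: dict):
    """Update hatyan-wide defaults, raising on unknown keys."""
    for key, value in updates.items():
        if key not in defaults:
            raise KeyError(f"unknown hatyan default: '{key}'")
        defaults[key] = value

# usage:
# set_defaults({"plot.figsize": (10, 5)})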

Derive waarnemingssoort from table

According to anafea.f, the waarnemingssoort (WNS) is a combination of: parametercode-hoedanigheidcode-eenheidcode-domeincode

Implement function to get the WNS from these metadata attrs:

  • quantity
  • vertref
  • unit
  • domeincode? Check ddl metadata what it is (codes are I or F)

Values present in wnstab.f
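
A hypothetical sketch of such a lookup function; the key tuple follows the four metadata attributes above and the table entry is a placeholder that would have to be taken from wnstab.f:

# hypothetical WNS lookup; fill WNS_TABLE with the actual values from wnstab.f
WNS_TABLE = {
    # (quantity, vertref, unit, domeincode): WNS  (placeholder entry below)
    ("WATHTE", "NAP", "cm", "F"): 18,
}

def wns_from_metadata(quantity, vertref, unit, domeincode):
    key = (quantity, vertref, unit, domeincode)
    if key not in WNS_TABLE:
        raise KeyError(f"no WNS defined for metadata combination {key}")
    return WNS_TABLE[key]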

Consider adding `Components` and `Timeseries` classes

Create Components, Timeseries and Extremes classes to add metadata, like so:

import pandas as pd

# https://pandas.pydata.org/pandas-docs/stable/development/extending.html#extending-register-accessors
# @pd.api.extensions.register_dataframe_accessor("hatyan")
class Components:
    def __init__(self, components, amplitudes, phases, 
                 station=None, quantity=None, vertref=None, unit=None, observationtype=None,
                 ana_tstart=None, ana_tstop=None, ana_tstep_min=None,
                 timezone=None,
                 ):
        
        pandas_obj = pd.DataFrame({'A':amplitudes, 'phi_deg':phases}, index=components)
        self._obj = pandas_obj
        self._meta = dict(station=station, quantity=quantity, vertref=vertref, 
                          unit=unit, observationtype=observationtype,
                          ana_tstart=ana_tstart, ana_tstop=ana_tstop, 
                          ana_tstep_min=ana_tstep_min,
                          timezone=timezone)
    
    @classmethod
    def from_dataframe(cls, pandas_obj):
        components = pandas_obj.index
        amplitudes = pandas_obj['A']
        phases = pandas_obj['phi_deg']
        return cls(components, amplitudes, phases)
    
    # duplicate of obj property
    def to_dataframe(self):
        return self._obj
    
    # @staticmethod
    
    @property
    def obj(self):
        return self._obj
    
    @property
    def metadata(self):
        return self._meta

    # desirable to make iloc/concat etc available with 
    # def __setitem__(self, key, value):
    #     self.obj[key] = value

    # upon printing this class, return the normal dataframe representation
    def __repr__(self):
        return repr(self._obj)

amps = [1,2,3]
phis = [0,1,2]
comps = ['M2','SA','SM']
comp0_class = Components(comps, amps, phis)
print(comp0_class)
comp1_pd = pd.DataFrame({'A':amps, 'phi_deg':phis}, index=comps)
comp1_pd.loc['A0','A'] = 0.3
comp1_class = Components.from_dataframe(comp1_pd) #TODO: test for nans in data
print(comp1_class)
comp1_class.to_dataframe()
comp1_class.metadata

Todo:

  • Currently metadata is dropped upon ts = ts.iloc[:50], this can probably be avoided with a class. >> they are now kept since attrs are persistent, so not an issue anymore
  • Also consider creating a Metadata class? And an Extremes class?
  • also add settings or print_settings properties
  • this can become quite cumbersome, with not too much added value (besides breaking scripts)
  • a timeseries class could have a plot property, but we would still need a separate hatyan.plot_timeseries() to combine multiple timeseries. This also goes for hatyan.merge_componentgroups() and quite some other functions that require multiple components/timeseries dataframes as input.

Are init/exit functions still needed?

Are the functions in https://github.com/Deltares/hatyan/blob/main/hatyan/wrapper_RWS.py still needed or can they be deprecated?

Duplicate issue: #106

Features to consider (a minimal replacement sketch follows the list):

  • provides toggle for interactive plotting (Qt5agg/Agg)
  • creates folder with unique name (dir_output)
  • does os.chdir(dir_output) so all figures/files are written to dir_output
  • sets mpl savedir, for saving interactive figures
  • copies input script to dir_output
  • takes care of logging (hatyan.sh actually wrote it to a log file, so this is not available anymore)
  • provides script timer
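
A hedged sketch of a context manager that could cover most of these features if the wrapper functions are deprecated; the name and behaviour are illustrative, not the existing hatyan API:

import os
import time
from contextlib import contextmanager

import matplotlib

@contextmanager
def rws_context(dir_output, interactive_plots=False):
    # toggle interactive plotting backend (Qt5Agg/Agg)
    matplotlib.use("Qt5Agg" if interactive_plots else "Agg")
    os.makedirs(dir_output, exist_ok=True)              # create output folder
    matplotlib.rcParams["savefig.directory"] = dir_output  # mpl savedir for interactive figures
    t_start = time.time()                               # script timer
    try:
        yield dir_output
    finally:
        print(f"elapsed: {time.time() - t_start:.1f} s")

# usage:
# with rws_context("./output", interactive_plots=True) as dir_output:
#     ...run analysis/prediction and write figures/files to dir_output...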

Improve metadata in functions and component file

TODO:

  • two header examples in #93
  • first improvements in #144
  • how to make computer-readable (and writeable) analysis settings for component file (necessary since we will run with fu_alltimes soon)
  • remove commented line in metadata_from_obj >> no, this would make using noos files or netcdf files way more complicated.
  • remove metadata_from_ddlmeta(), not used/tested (also not in kwk repos) and seems not generic
  • avoid writing diafile with tzone!=1 (already done). Mention somewhere that "in DONAR all data is in MET" (so writing with another tzone should not be possible)
  • avoid writing compfile with nodalfactors=False or fu_alltimes=True or source!="schureman" until the metadata can be included in header. Even then, we still cannot remove stats_xfac0 from example scripts since we need it when starting from diafiles
  • make source (schureman/foreman) also attribute to be added and checked everywhere
  • update metadata for extremes in calc_HWLW? At least TYP >> not necessary since we distinguish with HWLWcode columns in dataframe. Could be done with additional groepering attribute, but would make it too complex.
  • prevent prediction with xfac=True using a compfile with xfac=False (or vice versa). Compare all ana/pred options in the prediction() function; see the sketch below.
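
A hypothetical sketch of such a settings check; the metadata keys and the helper name are assumptions, not the current hatyan implementation:

SETTINGS_KEYS = ["nodalfactors", "xfac", "fu_alltimes", "source"]

def assert_matching_settings(comp_metadata: dict, pred_settings: dict):
    """Raise if the analysis settings in the component metadata differ from
    the settings requested for the prediction."""
    for key in SETTINGS_KEYS:
        if key not in comp_metadata:
            raise KeyError(f"component metadata misses required setting '{key}'")
        if comp_metadata[key] != pred_settings[key]:
            raise ValueError(
                f"prediction setting {key}={pred_settings[key]} conflicts with "
                f"component metadata {key}={comp_metadata[key]}"
            )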

Non-parsed metadata from component file (example headers in #93):

  • *valid for year
  • *SA/SM from different period
  • *theoretische f-factoren (means xfac=0)
  • *component splitting
  • CODE=3 >> not relevant for hatyan2

Missing metadata in component file:

  • timestep_min (PERD start stop 60 is timezone according to anadea.f)
  • nodalfactors on/off (always on)
  • xfac (off if "theoretische" in headerlines)
  • nodalfactors on all timesteps or per year (always per year up to now)
  • MIDD replaced by slotgemiddelde

Missing metadata in diafile:

  • timezone, but donar is always MET

Current status is:

* written with hatyan-2.7.2
* origin :  from timeseries dia file
* nodalfactors :  True
* xfac :  True
* fu_alltimes :  False
* groepering :  NVT
* timestep_min :  60.0
* timestep_unit :  min
* TYP :  TE
* components_sec :  SA, SM imported from component file
STAT  VLISSGN    WATHTE    NAP    cm    1
PERD  20090001  000000  20120031  230000     60
COMP      3
MIDD      1.00
NCOM    95
COMP    0     0.000000     1.000    0.00  A0          
[...]

MWE:

import os
import hatyan

dir_testdata = 'C:\\DATA\\hatyan_data_acceptancetests'
current_station = 'VLISSGN'

#read timeseries, components from analysis, write component file
file_data_comp0 = [os.path.join(dir_testdata, 'predictie2019', f'{current_station}_obs{i}.txt') for i in [1,2,3,4]]
ts_measurements_group0 = hatyan.readts_dia(filename=file_data_comp0, station=current_station)
comp_frommeas_avg_group0 = hatyan.analysis(ts=ts_measurements_group0, const_list='year')
hatyan.write_components(comp_frommeas_avg_group0, 'test_frommeas.cmp')

#read+write component file
file_data_comp1 = os.path.join(dir_testdata, 'predictie2019', f'{current_station}_ana.txt')
COMP_merged = hatyan.read_components(filename=file_data_comp1)
COMP_merged.loc['A0','A'] = 1.23 #TODO: add to metadata if overwritten
hatyan.write_components(COMP_merged, 'test_fromfile.cmp')

Drop Python 3.8 support

  • python 3.8 is EOL in October 2024: https://devguide.python.org/versions
  • many packages already dropped support, overview in: Deltares/dfm_tools#267
  • update supported python versions in pyproject.toml
  • update readme/contributing/installation
  • HMC currently uses Python 3.8.14 and RedHat8 does not support python 3.9. Wait for the HMC update to RedHat 9 (update 28-5-2024: it is being tested, all systems will be transferred somewhere after summer 2024)

Waiting issues:

  • #156
  • the mindeps yml was made quite complex to make requirement solving on py3.8 possible by excluding some optional dependencies. In py3.9 this will be easier, since toml-to-requirements>=0.2.0 (only available for py>=3.9) supports including only some of the optional requirement lists

Code improvements

Documentation:

  • update docstrings in all exposed functions in modules section
  • add extra examples
  • filter/remove example scripts
  • fix python code coloring in embedded ipynb
  • fix code quality badge (& in url goes wrong): omnilib/sphinx-mdinclude#19
  • check zenodo (should be fixed for releases after 14-5-2024)

hatyansettings:

  • move validation of timestep argument for prediction() to HatyanSettings?
  • clean up HatyanSettings related code, maybe validate arguments in a more charming way
  • also validate const_list and times with hatyan.HatyanSettings; const_list is now validated with check_requestedconsts multiple times (u/f/analysis functions), times is validated in hatyan.prediction()
  • maybe use kwargs in HatyanSettings to reduce the number of input arguments. Still raise an exception if not all kwargs were popped after init. The complexity of this class is also way too high, so maybe redesign it first.
  • deprecate timezone argument for astrog* functions. Derive from tstart/tstop instead?

resample_timeseries:

  • hatyan.resample_timeseries(): round dataframe to 'S' before doing anything, start new datetimeindex on 00 minute
  • make resample_timeseries() work with a times slice instead of three keywords? Or deprecate it. At least convert timestep_min to timestep and do the freq validation there; see the sketch after this list
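
A minimal sketch of the rounding/reindexing idea above; the function name, the freq default and the reindex-only behaviour are assumptions, not the current hatyan.resample_timeseries() implementation:

import pandas as pd

def resample_rounded(ts: pd.DataFrame, freq: str = "10min") -> pd.DataFrame:
    ts = ts.copy()
    ts.index = ts.index.round("s")  # round index to whole seconds first
    # start the new datetimeindex on a whole minute/timestep
    new_index = pd.date_range(ts.index[0].floor(freq), ts.index[-1], freq=freq)
    return ts.reindex(new_index)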

metadata:

  • Consider updating metadata "origin" attribute of components object: when slotgem is overwritten, with component splitting, with merging of component sets, in case of analysis-perperiod vs. atonce and maybe other settings. Not possible to derive these details from existing/old components file, so is it useful?
  • add vertref/station from metadata to components/timeseries plot (if available), also requires comparing these attrs between input arguments
  • write_dia does not allow vertref/station arguments anymore, add deprecationerror

Code improvements:

  • #57
  • Increase performance with numba (lru_cache also helps significantly), maybe add pandas performance dependencies
  • Align foreman/schureman, this avoids duplicate code with the same functions. Not easy, but aligning the shallowrelations should be possible. Or move from schureman to simplified foreman?
  • Improve analysis/prediction varnames? (e.g. v, u, f)
  • Check if there are any try/except combinations that are not error specific. Raise predefined errors like ValueError or custom errors like hatyan.analysis_prediction.MatrixConditionTooHigh and catch them properly (not a general 'except:' and no general 'raise Exception()'); see the sketch after this list
  • Numbering of extremes/HWLW: store phasediff-compute-script (numbering_extremes.py) properly
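
A sketch of a module-level custom error and error-specific handling, assuming a MatrixConditionTooHigh error in hatyan.analysis_prediction as named above; the threshold and function are illustrative only:

class MatrixConditionTooHigh(Exception):
    """Raised when the analysis matrix condition number exceeds the threshold."""

def analysis_step(condition_number, max_condition=1e12):
    if condition_number > max_condition:
        raise MatrixConditionTooHigh(
            f"matrix condition {condition_number:.1e} exceeds {max_condition:.1e}"
        )

# catch the specific error instead of a bare `except:`
try:
    analysis_step(condition_number=1e15)
except MatrixConditionTooHigh as e:
    print(f"skipping period: {e}")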

Improve configfiles:

Astrog:

  • Astrog moonriseset sometimes fails: TEMP_astrog_sunrise_lonissue.py
  • Reproduction works, but not everything works for all coordinates
  • Consider moving to pyephem or another python astro package (or astropy, or pint for units)
  • consider moving astrog to separate package

Pass metadata between objects

Metadata in components file is often empty when written by hatyan:

* no metadata available
MIDD   -3.00 cm
NCOM   95
COMP    0     0.000000    -3.000    0.00  A0          
[...]

Original component file contains the following:

  • *SA/SM from different period
  • *component splitting
  • *theoretische f-factoren (means xfac=0)
  • station_name, quantity, vertref, unit, WNS (combination of quantity+vertref?)
  • analysis_start, analysis_stop, timestep_minutes
  • unknown: CODE=3
  • diafiles missing: timezone
  • compfiles missing: timestep_min
  • compfiles missing: how to distinguish nodalfactors on/off
  • compfiles missing: how to distinguish xfac
  • compfiles missing: how to distinguish nodalfactors on all timesteps or per year
  • compfiles missing: MIDD replaced by slotgemiddelde

Examples:

* Analyse over 2009 t/m 2012    Geldig voor jaar 2015                           
* SA en SM uit ber. over 1976...1994 c.q. herleid                               
* Station VLISSGN = Vlissingen                                                  
STAT  VLISSGN       WATHTE            NAP           cm           1              
PERD  20090101  0000  20121231  2300     60                                     
CODE     3                                                                      
MIDD     1.000
NCOM   94
COMP    1      .041069     7.410  216.10  SA      
[...]
* Analyse met componentsplitsing op uurw. maart 2010     
* met theoretische f-factoren en comp.splitsing mbv K13  
* Meetpunt D15 = D15-A platform                          
STAT  DENHDR        WATHTE            NAP           cm           1              
PERD  20090101  0000  20121231  2300     60                                     
CODE     3
MIDD      .000
NCOM   28
COMP    1      .041069     7.790  231.20  SA          
[...]

Todo:

  • add metadata from components file to components object (is pd.DataFrame)
  • add metadata from dia timeseries file to timeseries object (is pd.DataFrame)
  • add metadata to the components file header (partly worked before): station, tstart/tstop (maybe 2x/3x), interval of source data, quantity, vertical reference, unit, timezone
  • add xfac/nodalfactors etc to components file (and to components pd metadata)
  • retain metadata between objects (with analysis, merge_component_groups and prediction)
  • remove cm behind MIDD
  • add translation/check of WNS/quantity/vertref/unit, WNS is not present in wia file so: #135
  • tstart etc in metadata of non-equidistant timeseries from e.g. an ext file? Does adding metadata also work there, since the headercodes are different?
  • WATHBRKD or WATHTE in the compfile? Check if all metadata is correct, also after processing: e.g. a ts with WATHTE is used for analysis and the resulting components are used for a prediction, but the prediction should be WATHBRKD since it is not a measurement anymore but tide instead.
  • retain/update metadata with crop_timeseries, resample_timeseries and maybe more (and add metadata_from_obj to the respective testcases); see the sketch after this list. However, if the input ts is not equidistant, maybe do not add these attrs. We can check for equidistance with ts.index.freq!=None.
  • retain metadata with components_timeshift() and change tzone
  • check for tzone==1 when writing tsdia
  • drop adding of manual metadata (e.g. vertref from ana/dia files instead of top of example script) >> done for VLISSGN, todo for other example scripts
  • check if all necessary metadata is available in the file. Add an escape if no metadata is available in the component file (for files written with hatyan<=2.7.0)
  • consider removing A0 component
  • run entire testbank

Rest of tasks are moved to follow-up issue: #139
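
A minimal sketch of retaining metadata (stored in DataFrame.attrs) across an operation like crop_timeseries; the function name is illustrative, not the existing hatyan implementation:

import pandas as pd

def crop_timeseries_keep_meta(ts: pd.DataFrame, tstart, tstop) -> pd.DataFrame:
    ts_out = ts.loc[tstart:tstop].copy()
    ts_out.attrs = dict(ts.attrs)  # copy metadata explicitly, pandas does not always propagate attrs
    return ts_out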

Fix unnoticed failing example scripts

Some example scripts are failing, but this is not noticed by pytest because of the complex dir_output arguments in cases without init/exitRWS.

This happens currently in check_duplicate_components.py and compare_foremanschureman_freqs.py. Previously, export_freq_v0uf_data.py raised unnoticed errors (#58).

This is not captured, since non-initRWS scripts cannot be checked for the presence of the NOT_FINISHED file. Solve this by e.g. dropping initRWS and/or simplifying the testbank method. Maybe it can be resolved by providing a non-unique dir_output in the testbank so initRWS does not create one?
Also, solve the failure itself.

Speed up acceptance tests

Five longest durations:

155.00s call     tests/test_hatyan_examples.py::test_examplescripts[export_freq_v0uf_data]
67.34s call     tests/test_hatyan_examples.py::test_examplescripts[KWK_process] >> moved to kwk repos
40.63s call     tests/test_hatyan_examples.py::test_examplescripts[predictie_2019_19Ycomp4Ydia]
33.44s call     tests/test_hatyan_examples.py::test_examplescripts[numbering_extremes]
31.30s call     tests/test_hatyan_examples.py::test_examplescripts[spatialsummary]

Fix error_bad_lines deprecation warnings and Length mismatch error

Fix deprecation warnings and the resulting error in export_freq_v0uf_data.py, blocked by #59. A possible replacement for the deprecated arguments is sketched after the log below.

./export_freq_v0uf_data.py:66: FutureWarning: The error_bad_lines argument has been deprecated and will be removed in a future version. Use on_bad_lines in the future.

  hatyan55_freq_raw = pd.read_csv(file_hatyan55, names=colnames_freq, skiprows=21, nrows=219, delim_whitespace=True, error_bad_lines=False, warn_bad_lines=False)
./export_freq_v0uf_data.py:66: FutureWarning: The warn_bad_lines argument has been deprecated and will be removed in a future version. Use on_bad_lines in the future.

  hatyan55_freq_raw = pd.read_csv(file_hatyan55, names=colnames_freq, skiprows=21, nrows=219, delim_whitespace=True, error_bad_lines=False, warn_bad_lines=False)
./export_freq_v0uf_data.py:82: FutureWarning: The error_bad_lines argument has been deprecated and will be removed in a future version. Use on_bad_lines in the future.

  hatyan55_v0uf_raw_allyears = pd.read_csv(file_hatyan55, names=colnames_v0uf, skiprows=270, delim_whitespace=True, error_bad_lines=False, warn_bad_lines=False)
./export_freq_v0uf_data.py:82: FutureWarning: The warn_bad_lines argument has been deprecated and will be removed in a future version. Use on_bad_lines in the future.

  hatyan55_v0uf_raw_allyears = pd.read_csv(file_hatyan55, names=colnames_v0uf, skiprows=270, delim_whitespace=True, error_bad_lines=False, warn_bad_lines=False)
Traceback (most recent call last):
  File "./export_freq_v0uf_data.py", line 107, in <module>
    hatyan55_freq, hatyan55_v0u, hatyan55_f, dood_date_hatyan55, dood_date_hatyan55_v0 = get_hatyan55_values(file_hatyan55)
  File "./export_freq_v0uf_data.py", line 98, in get_hatyan55_values
    hatyan55_v0uf_1y.index = hatyan55_freq.index
  File "/opt/hmc-python/lib/python3.8/site-packages/pandas/core/generic.py", line 5915, in __setattr__
    return object.__setattr__(self, name, value)
  File "pandas/_libs/properties.pyx", line 69, in pandas._libs.properties.AxisProperty.__set__
  File "/opt/hmc-python/lib/python3.8/site-packages/pandas/core/generic.py", line 823, in _set_axis
    self._mgr.set_axis(axis, labels)
  File "/opt/hmc-python/lib/python3.8/site-packages/pandas/core/internals/managers.py", line 230, in set_axis
    self._validate_set_axis(axis, new_labels)
  File "/opt/hmc-python/lib/python3.8/site-packages/pandas/core/internals/base.py", line 70, in _validate_set_axis
    raise ValueError(
ValueError: Length mismatch: Expected axis has 210 elements, new values have 219 elements
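
Since pandas>=1.3, the deprecated error_bad_lines/warn_bad_lines pair is replaced by a single on_bad_lines argument. A sketch of the swap in export_freq_v0uf_data.py, reusing the file_hatyan55 and colnames_freq variables already defined in that script:

import pandas as pd

hatyan55_freq_raw = pd.read_csv(file_hatyan55, names=colnames_freq, skiprows=21, nrows=219,
                                delim_whitespace=True, on_bad_lines='skip')
# on newer pandas, delim_whitespace is deprecated as well and can be replaced by sep=r"\s+"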

improve netcdf HWLW output

  • add coords attribute to variables (time)
  • solve "ValueError: Failed to decode variable 'HWLWno': unable to decode time units 'n-th tidal wave since reference wave at Cadzand on 1-1-2000' with 'the default calendar'. Try opening your dataset with decode_times=False or installing cftime if it is not installed." when opening resulting file with xarray. This can be solved by using long_name instead of units attribute
  • solve "ValueError: Failed to decode variable 'times_astro_HW': unable to decode time units 'minutes since 1900-01-01 00:00:00 ' with 'the default calendar'. Try opening your dataset with decode_times=False or installing cftime if it is not installed.". This happens since there is are NaT times in the variables times_astro_HW and times_astro_LW, since the series with extremes starts with an LW and ends with an HW.
  • for later: add support for multiple stations (different extremes times), so time variable has dimensions (HWLWno and station)
  • coordinates design: https://issuetracker.deltares.nl/browse/FEWS-27093
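
A sketch of the long_name fix for the HWLWno variable; the file and dimension names are illustrative, not the actual hatyan.write_netcdf() output:

import netCDF4

with netCDF4.Dataset("hwlw_example.nc", "w") as ds:
    ds.createDimension("HWLWno", None)
    var = ds.createVariable("HWLWno", "i4", ("HWLWno",))
    # use long_name instead of units, so xarray does not try to decode it as time
    var.long_name = "n-th tidal wave since reference wave at Cadzand on 1-1-2000"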

add HWLW numbering testcase

add 19y/multiyear HWLW numbering validation by:

  • numbering extremes derived from waterlevels (not astro)
  • compare with reference primary extremes from RWS (this dataset is not yet available)
  • computing time/height difference, nans should not be present in the difference
  • convert numbering_FEWS_PG.py to unittest
  • figures of numbering_extremes.py were shifted since the introduction of timezones, why?

create callable hatyan module with `__main__.py`

Replace init_RWS() and exit_RWS() with a __main__.py so hatyan can be called as python -m hatyan configfile.py --interactive-plots. Can all features be included? Including stdout to file (maybe add a --redirect-stdout option). Also add explanation text that prints with hatyan -h (take it from the docstring). How to exec the configfile as part of __main__.py? Maybe add features from the old hatyan.sh script? A minimal sketch is included below.
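
A minimal sketch of such a hatyan/__main__.py under the assumptions above; the argument names are illustrative, not the implemented CLI:

import argparse
import runpy
import sys

def main():
    parser = argparse.ArgumentParser(
        prog="hatyan", description="run a hatyan configfile (python script)")
    parser.add_argument("configfile", help="python script to execute")
    parser.add_argument("--redirect-stdout", help="write stdout to this file")
    parser.add_argument("--interactive-plots", action="store_true")
    args = parser.parse_args()

    if args.interactive_plots:
        import matplotlib
        matplotlib.use("Qt5Agg")  # toggle interactive backend

    if args.redirect_stdout:
        sys.stdout = open(args.redirect_stdout, "w")  # redirect stdout to file

    runpy.run_path(args.configfile, run_name="__main__")  # exec the configfile

if __name__ == "__main__":
    main()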

extract component splitting ampfacs/degincrs from ana-file of linked station

For D15 and others, components are added by using amplitude factors and degree differences relative to K13 components. The header of the D15 dia-file reads "* met theoretische f-factoren en comp.splitsing mbv K13". Therefore, these factors/differences can be computed from (a subset of) the K13 ana file. This would simplify the user input significantly.

The hardcoded factors/differences from the user input script are:

if current_station == 'D15':
    CS_comps = pd.DataFrame({'CS_comps_derive':['P1','NU2','LABDA2','K2','T2'],
                             'CS_comps_from':['K1','N2','2MN2','S2','S2'],
                             'CS_ampfacs':[0.33,0.22,0.48,0.29,0.05],
                             'CS_degincrs':[-11,-24,174,1,-24]})

This is the subset from the K13 2007 component file:

COMP   33    14.958931     2.869  334.21  P1           
COMP   35    15.041069     8.598  345.06  K1         
COMP   59    28.439730    10.055  189.00  N2          
COMP   60    28.512583     2.224  164.68  NU2      
COMP   70    29.455625     1.562  185.58  LABDA2      
COMP   71    29.528479     3.256   11.33  2MN2        
COMP   76    29.958933      .877  237.59  T2          
COMP   77    30.000000    19.019  261.96  S2          
COMP   79    30.082137     5.489  260.56  K2   

Reproduction of factors/differences was successful:

import pandas as pd
import hatyan

CS_comps = pd.DataFrame({'CS_comps_derive':['P1','NU2','LABDA2','K2','T2'],
                         'CS_comps_from':['K1','N2','2MN2','S2','S2'],
                         'CS_ampfacs':[0.33,0.22,0.48,0.29,0.05],
                         'CS_degincrs':[-11,-24,174,1,-24]})

file_compk13 = r'p:\1209447-kpp-hydraulicaprogrammatuur\hatyan\hatyan_data_acceptancetests\predictie2019\k13APFM_ana_2007.txt'
comp_k13 = hatyan.read_components(file_compk13)

for compd,compf in zip(CS_comps['CS_comps_derive'],CS_comps['CS_comps_from']):
    diff_amp = comp_k13.loc[compd,'A'] / comp_k13.loc[compf,'A']
    diff_deg = comp_k13.loc[compd,'phi_deg'] - comp_k13.loc[compf,'phi_deg']
    print(compd,compf)
    print(f'{diff_amp:.4f}  {diff_deg:.1f}')

Returns:

P1 K1
0.3337  -10.9
NU2 N2
0.2212  -24.3
LABDA2 2MN2
0.4797  174.2
K2 S2
0.2886  -1.4
T2 S2
0.0461  -24.4

Since the split components are often the same, combine this with #133 and set CS_comps_derive and CS_comps_from as global defaults. In that case the only required input is the ana file or a components dataframe; see the sketch below.
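
A sketch of deriving the CS_comps dataframe from a linked-station component set; the function name is an assumption and the default component pairs are taken from the reproduction above:

import pandas as pd

CS_PAIRS_DEFAULT = [("P1", "K1"), ("NU2", "N2"), ("LABDA2", "2MN2"),
                    ("K2", "S2"), ("T2", "S2")]

def derive_cs_comps(comp_linked: pd.DataFrame, pairs=CS_PAIRS_DEFAULT) -> pd.DataFrame:
    rows = []
    for comp_derive, comp_from in pairs:
        rows.append({
            "CS_comps_derive": comp_derive,
            "CS_comps_from": comp_from,
            "CS_ampfacs": comp_linked.loc[comp_derive, "A"] / comp_linked.loc[comp_from, "A"],
            "CS_degincrs": comp_linked.loc[comp_derive, "phi_deg"] - comp_linked.loc[comp_from, "phi_deg"],
        })
    return pd.DataFrame(rows)

# usage with the K13 component file from the reproduction above:
# comp_k13 = hatyan.read_components(file_compk13)
# CS_comps = derive_cs_comps(comp_k13)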

Relevant stations:

  • A12
  • D15 (from K13)
  • F16
  • F3PFM (from ??)
  • J6
  • K14PFM (from K13)
  • L9PFM
  • NORTHCMRT
  • Q1

Also add metadata that the analysis uses component splitting based on a certain station.

Incorrect HW in timeseries with gap

Incorrect HW/LW in case of timeseries gap.


MWE:

import datetime as dt
import hatyan
file_noos = r'c:\DATA\hatyan_data_acceptancetests\other\VLISSGN_waterlevel_20061201_20190101.noos'
ts_meas_pd = hatyan.readts_noos(file_noos)
ts_meas_HWLW = hatyan.calc_HWLW(ts_meas_pd)
fig,(ax1,ax2) = hatyan.plot_timeseries(ts=ts_meas_pd,ts_ext=ts_meas_HWLW)
ax1.set_xlim(dt.datetime(2010,5,1),dt.datetime(2010,6,1))

This is to be expected, since the scipy.signal.find_peaks docs mention "This function may return unexpected results for data containing NaNs. To avoid this, NaNs should either be removed or replaced." However, there are no nans in this timeseries, it is just a gap in time. One possible workaround is sketched below.
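
One possible workaround (an assumption, not the implemented fix): make the gap explicit by reindexing to the regular measurement frequency and run calc_HWLW per contiguous block of valid data:

import pandas as pd

def iter_contiguous_blocks(ts: pd.DataFrame, freq: str = "10min"):
    full_index = pd.date_range(ts.index[0], ts.index[-1], freq=freq)
    ts_full = ts.reindex(full_index)
    isvalid = ts_full["values"].notna()
    block_ids = isvalid.ne(isvalid.shift()).cumsum()  # consecutive valid/invalid runs
    for _, block in ts_full.groupby(block_ids):
        if block["values"].notna().all():
            yield block

# ts_meas_HWLW = pd.concat([hatyan.calc_HWLW(block)
#                           for block in iter_contiguous_blocks(ts_meas_pd)])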

Ensure installation/pytests with environments work as expected

Some remaining tasks from #59:

  • test environment.yml creation and hatyan installation from github main, to ensure proper dependencies >> fails with python=3.7, works with python=3.8
  • create empty env and hatyan installation from github main, to ensure proper dependencies >> fails with python=3.7, works with python=3.8

getting-started.ipynb fails in binder (io error)

Notebook does work in local jupyter notebook

pydata/xarray#4043 (comment) suggests pydap, but that was also not available locally (and it did not solve the problem in the issue).

pydata/xarray#4043 (comment) suggests installing via conda and checking the libnetcdf version, but it did not solve the problem.

libnetcdf and pydap are both not available in binder (importing fails), but also not locally (pydap is available, but it was installed manually and it also worked before that).

Check if a conda install works, but then it is not pip-only anymore. However, this was also an issue with netcdf4-only.

It does work in dfm_tools, so check if those paths work in hatyan_env and, if not, what the difference between the envs is (conda is one difference).

Maybe windows/linux difference?

Replace `times_ext` list and `timestep_min` with `slice(tstart,tstop,tstep_min)`

Or merge times_ext and times_pred_all, but support both pd.DatetimeIndex and slice. Rename the argument to times.

Start again since branch has many merging conflicts: https://github.com/Deltares/hatyan/pull/105/files

Also support string tstart etc, use pd.Timestamp to convert them. This also supports timezones:

pd.Timestamp('2000-01-01 00:01 +01:00')
Out[10]: Timestamp('2000-01-01 00:01:00+0100', tz='pytz.FixedOffset(60)')
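
A sketch of converting such a times argument to a pd.DatetimeIndex, accepting a slice of strings/Timestamps or an existing DatetimeIndex; this is an assumption of how the merged times argument could behave:

import pandas as pd

def times_to_index(times, default_freq="10min") -> pd.DatetimeIndex:
    if isinstance(times, pd.DatetimeIndex):
        return times
    if isinstance(times, slice):
        freq = times.step if times.step is not None else default_freq
        return pd.date_range(pd.Timestamp(times.start), pd.Timestamp(times.stop), freq=freq)
    raise TypeError(f"unsupported times type: {type(times)}")

# usage:
# times_to_index(slice("2000-01-01", "2000-02-01", "10min"))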

Fix devenv locally and in github workflow

Some remaining tasks from #59, mainly since hmc env creation raises numpy errors:

  • add pytest, pytest-cov, bumpversion, jupyter, notebook and maybe others to hmc env
  • update environment_hmc.yml with 10feb2023 versions (also python+pip versions)
  • updating environment_hmc.yml did not solve the env creation issue, but relaxing the pinned scipy version (scipy==1.3.1 from August 2019) did resolve it. Therefore, the HMC env has to be updated to a newer scipy version (at least 1.5.4, but 1.7.3 also works)
  • update devenv workflow (pytest-devenv.yml) to correspond to environment_hmc.yml. In case of using yml+conda and not pip+environment.txt, check dfm_tools gendocs action
(base) c:\DATA\hatyan_github>conda env create -f environment_hmc.yml
Collecting package metadata (repodata.json): done
Solving environment: done

Downloading and Extracting Packages

Preparing transaction: done
Verifying transaction: done
Executing transaction: done
Installing pip dependencies: / Ran pip subprocess with arguments:
['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\python.exe', '-m', 'pip', 'install', '-U', '-r', 'c:\\DATA\\hatyan_github\\condaenv.60xyqn5q.requirements.txt', '--exists-action=b']
Pip subprocess output:
Collecting matplotlib==3.5.2
  Using cached matplotlib-3.5.2-cp38-cp38-win_amd64.whl (7.2 MB)
Collecting netCDF4==1.6.0
  Using cached netCDF4-1.6.0-cp38-cp38-win_amd64.whl (3.1 MB)
Collecting numpy==1.23.0
  Using cached numpy-1.23.0-cp38-cp38-win_amd64.whl (14.7 MB)
Collecting pandas==1.4.3
  Using cached pandas-1.4.3-cp38-cp38-win_amd64.whl (10.6 MB)
Collecting pyproj==3.3.1
  Using cached pyproj-3.3.1-cp38-cp38-win_amd64.whl (6.4 MB)
Collecting pytest==7.1.2
  Using cached pytest-7.1.2-py3-none-any.whl (297 kB)
Collecting pytest-cov==3.0.0
  Using cached pytest_cov-3.0.0-py3-none-any.whl (20 kB)
Collecting qtconsole==5.1.1
  Using cached qtconsole-5.1.1-py3-none-any.whl (119 kB)
Collecting requests==2.28.1
  Using cached requests-2.28.1-py3-none-any.whl (62 kB)
Collecting scipy==1.3.1
  Using cached scipy-1.3.1.tar.gz (23.6 MB)
  Installing build dependencies: started
  Installing build dependencies: finished with status 'error'

Pip subprocess error:
  error: subprocess-exited-with-error

  × pip subprocess to install build dependencies did not run successfully.
  │ exit code: 1
  ╰─> [679 lines of output]
      Ignoring numpy: markers 'python_version == "3.5" and platform_system != "AIX"' don't match your environment
      Ignoring numpy: markers 'python_version == "3.6" and platform_system != "AIX"' don't match your environment
      Ignoring numpy: markers 'python_version == "3.5" and platform_system == "AIX"' don't match your environment
      Ignoring numpy: markers 'python_version == "3.6" and platform_system == "AIX"' don't match your environment
      Ignoring numpy: markers 'python_version >= "3.7" and platform_system == "AIX"' don't match your environment
      Collecting wheel
        Using cached wheel-0.38.4-py3-none-any.whl (36 kB)
      Collecting setuptools
        Using cached setuptools-67.2.0-py3-none-any.whl (1.1 MB)
      Collecting Cython>=0.29.2
        Using cached Cython-0.29.33-py2.py3-none-any.whl (987 kB)
      Collecting numpy==1.14.5
        Using cached numpy-1.14.5.zip (4.9 MB)
        Preparing metadata (setup.py): started
        Preparing metadata (setup.py): finished with status 'done'
      Building wheels for collected packages: numpy
        Building wheel for numpy (setup.py): started
        Building wheel for numpy (setup.py): finished with status 'error'
        error: subprocess-exited-with-error

        × python setup.py bdist_wheel did not run successfully.
        │ exit code: 1
        ╰─> [288 lines of output]
            Running from numpy source directory.
            C:\SnapVolumesTemp\MountPoints\{45c63495-0000-0000-0000-100000000000}\{79DE0690-9470-4166-B9EE-4548DC416BBD}\SVROOT\Users\veenstra\AppData\Local\Temp\pip-install-7s_s0krk\numpy_a17179f1229646159af53045096af402\numpy\distutils\misc_util.py:464: SyntaxWarning: "is" with a literal. Did you mean "=="?
              return is_string(s) and ('*' in s or '?' is s)
            blas_opt_info:
            blas_mkl_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries mkl_rt not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            blis_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries blis not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            openblas_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries openblas not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
            get_default_fcompiler: matching types: '['gnu', 'intelv', 'absoft', 'compaqv', 'intelev', 'gnu95', 'g95', 'intelvem', 'intelem', 'flang']'
            customize GnuFCompiler
            Could not locate executable g77
            Could not locate executable f77
            customize IntelVisualFCompiler
            Could not locate executable ifort
            Could not locate executable ifl
            customize AbsoftFCompiler
            Could not locate executable f90
            customize CompaqVisualFCompiler
            Could not locate executable DF
            customize IntelItaniumVisualFCompiler
            Could not locate executable efl
            customize Gnu95FCompiler
            Could not locate executable gfortran
            Could not locate executable f95
            customize G95FCompiler
            Could not locate executable g95
            customize IntelEM64VisualFCompiler
            customize IntelEM64TFCompiler
            Could not locate executable efort
            Could not locate executable efc
            customize PGroupFlangCompiler
            Could not locate executable flang
            don't know how to compile Fortran code on platform 'nt'
              NOT AVAILABLE

            atlas_3_10_blas_threads_info:
            Setting PTATLAS=ATLAS
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries tatlas not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            atlas_3_10_blas_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries satlas not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            atlas_blas_threads_info:
            Setting PTATLAS=ATLAS
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries ptf77blas,ptcblas,atlas not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            atlas_blas_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries f77blas,cblas,atlas not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            C:\SnapVolumesTemp\MountPoints\{45c63495-0000-0000-0000-100000000000}\{79DE0690-9470-4166-B9EE-4548DC416BBD}\SVROOT\Users\veenstra\AppData\Local\Temp\pip-install-7s_s0krk\numpy_a17179f1229646159af53045096af402\numpy\distutils\system_info.py:624: UserWarning:
                Atlas (http://math-atlas.sourceforge.net/) libraries not found.
                Directories to search for the libraries can be specified in the
                numpy/distutils/site.cfg file (section [atlas]) or by setting
                the ATLAS environment variable.
              self.calc_info()
            blas_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries blas not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            C:\SnapVolumesTemp\MountPoints\{45c63495-0000-0000-0000-100000000000}\{79DE0690-9470-4166-B9EE-4548DC416BBD}\SVROOT\Users\veenstra\AppData\Local\Temp\pip-install-7s_s0krk\numpy_a17179f1229646159af53045096af402\numpy\distutils\system_info.py:624: UserWarning:
                Blas (http://www.netlib.org/blas/) libraries not found.
                Directories to search for the libraries can be specified in the
                numpy/distutils/site.cfg file (section [blas]) or by setting
                the BLAS environment variable.
              self.calc_info()
            blas_src_info:
              NOT AVAILABLE

            C:\SnapVolumesTemp\MountPoints\{45c63495-0000-0000-0000-100000000000}\{79DE0690-9470-4166-B9EE-4548DC416BBD}\SVROOT\Users\veenstra\AppData\Local\Temp\pip-install-7s_s0krk\numpy_a17179f1229646159af53045096af402\numpy\distutils\system_info.py:624: UserWarning:
                Blas (http://www.netlib.org/blas/) sources not found.
                Directories to search for the sources can be specified in the
                numpy/distutils/site.cfg file (section [blas_src]) or by setting
                the BLAS_SRC environment variable.
              self.calc_info()
              NOT AVAILABLE

            'svnversion' is not recognized as an internal or external command,
            operable program or batch file.
            non-existing path in 'numpy\\distutils': 'site.cfg'
            'svnversion' is not recognized as an internal or external command,
            operable program or batch file.
            F2PY Version 2
            lapack_opt_info:
            lapack_mkl_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries mkl_rt not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            openblas_lapack_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries openblas not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            openblas_clapack_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries openblas,lapack not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            atlas_3_10_threads_info:
            Setting PTATLAS=ATLAS
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries tatlas,tatlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries tatlas,tatlas not found in C:\
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries tatlas,tatlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\libs
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\libs
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries tatlas,tatlas not found in C:\Users\veenstra\Anaconda3\Library\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\Library\lib
            <class 'numpy.distutils.system_info.atlas_3_10_threads_info'>
              NOT AVAILABLE

            atlas_3_10_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries satlas,satlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries satlas,satlas not found in C:\
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries satlas,satlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\libs
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\libs
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries satlas,satlas not found in C:\Users\veenstra\Anaconda3\Library\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\Library\lib
            <class 'numpy.distutils.system_info.atlas_3_10_info'>
              NOT AVAILABLE

            atlas_threads_info:
            Setting PTATLAS=ATLAS
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries ptf77blas,ptcblas,atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries ptf77blas,ptcblas,atlas not found in C:\
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries ptf77blas,ptcblas,atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\libs
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\libs
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries ptf77blas,ptcblas,atlas not found in C:\Users\veenstra\Anaconda3\Library\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\Library\lib
            <class 'numpy.distutils.system_info.atlas_threads_info'>
              NOT AVAILABLE

            atlas_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries f77blas,cblas,atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries f77blas,cblas,atlas not found in C:\
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries f77blas,cblas,atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\libs
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\libs
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries f77blas,cblas,atlas not found in C:\Users\veenstra\Anaconda3\Library\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\Library\lib
            <class 'numpy.distutils.system_info.atlas_info'>
              NOT AVAILABLE

            lapack_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            C:\SnapVolumesTemp\MountPoints\{45c63495-0000-0000-0000-100000000000}\{79DE0690-9470-4166-B9EE-4548DC416BBD}\SVROOT\Users\veenstra\AppData\Local\Temp\pip-install-7s_s0krk\numpy_a17179f1229646159af53045096af402\numpy\distutils\system_info.py:624: UserWarning:
                Lapack (http://www.netlib.org/lapack/) libraries not found.
                Directories to search for the libraries can be specified in the
                numpy/distutils/site.cfg file (section [lapack]) or by setting
                the LAPACK environment variable.
              self.calc_info()
            lapack_src_info:
              NOT AVAILABLE

            C:\SnapVolumesTemp\MountPoints\{45c63495-0000-0000-0000-100000000000}\{79DE0690-9470-4166-B9EE-4548DC416BBD}\SVROOT\Users\veenstra\AppData\Local\Temp\pip-install-7s_s0krk\numpy_a17179f1229646159af53045096af402\numpy\distutils\system_info.py:624: UserWarning:
                Lapack (http://www.netlib.org/lapack/) sources not found.
                Directories to search for the sources can be specified in the
                numpy/distutils/site.cfg file (section [lapack_src]) or by setting
                the LAPACK_SRC environment variable.
              self.calc_info()
              NOT AVAILABLE

            C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib\site-packages\setuptools\_distutils\dist.py:265: UserWarning: Unknown distribution option: 'define_macros'
              warnings.warn(msg)
            running bdist_wheel
            running build
            running config_cc
            unifing config_cc, config, build_clib, build_ext, build commands --compiler options
            running config_fc
            unifing config_fc, config, build_clib, build_ext, build commands --fcompiler options
            running build_src
            build_src
            building py_modules sources
            creating build
            creating build\src.win-amd64-3.8
            creating build\src.win-amd64-3.8\numpy
            creating build\src.win-amd64-3.8\numpy\distutils
            building library "npymath" sources
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            error: Microsoft Visual C++ 14.0 or greater is required. Get it with "Microsoft C++ Build Tools": https://visualstudio.microsoft.com/visual-cpp-build-tools/
            [end of output]

        note: This error originates from a subprocess, and is likely not a problem with pip.
        ERROR: Failed building wheel for numpy
        Running setup.py clean for numpy
        error: subprocess-exited-with-error

        × python setup.py clean did not run successfully.
        │ exit code: 1
        ╰─> [10 lines of output]
            Running from numpy source directory.

            `setup.py clean` is not supported, use one of the following instead:

              - `git clean -xdf` (cleans all files)
              - `git clean -Xdf` (cleans all versioned files, doesn't touch
                                  files that aren't checked into the git repo)

            Add `--force` to your command to use it anyway if you must (unsupported).

            [end of output]

        note: This error originates from a subprocess, and is likely not a problem with pip.
        ERROR: Failed cleaning build dir for numpy
      Failed to build numpy
      Installing collected packages: wheel, setuptools, numpy, Cython
        Running setup.py install for numpy: started
        Running setup.py install for numpy: finished with status 'error'
        error: subprocess-exited-with-error

        × Running setup.py install for numpy did not run successfully.
        │ exit code: 1
        ╰─> [325 lines of output]
            Running from numpy source directory.

            Note: if you need reliable uninstall behavior, then install
            with pip instead of using `setup.py install`:

              - `pip install .`       (from a git repo or downloaded source
                                       release)
              - `pip install numpy`   (last NumPy release on PyPi)


            blas_opt_info:
            blas_mkl_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries mkl_rt not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            blis_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries blis not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            openblas_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries openblas not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
            get_default_fcompiler: matching types: '['gnu', 'intelv', 'absoft', 'compaqv', 'intelev', 'gnu95', 'g95', 'intelvem', 'intelem', 'flang']'
            customize GnuFCompiler
            Could not locate executable g77
            Could not locate executable f77
            customize IntelVisualFCompiler
            Could not locate executable ifort
            Could not locate executable ifl
            customize AbsoftFCompiler
            Could not locate executable f90
            customize CompaqVisualFCompiler
            Could not locate executable DF
            customize IntelItaniumVisualFCompiler
            Could not locate executable efl
            customize Gnu95FCompiler
            Could not locate executable gfortran
            Could not locate executable f95
            customize G95FCompiler
            Could not locate executable g95
            customize IntelEM64VisualFCompiler
            customize IntelEM64TFCompiler
            Could not locate executable efort
            Could not locate executable efc
            customize PGroupFlangCompiler
            Could not locate executable flang
            don't know how to compile Fortran code on platform 'nt'
              NOT AVAILABLE

            atlas_3_10_blas_threads_info:
            Setting PTATLAS=ATLAS
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries tatlas not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            atlas_3_10_blas_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries satlas not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            atlas_blas_threads_info:
            Setting PTATLAS=ATLAS
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries ptf77blas,ptcblas,atlas not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            atlas_blas_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries f77blas,cblas,atlas not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            C:\SnapVolumesTemp\MountPoints\{45c63495-0000-0000-0000-100000000000}\{79DE0690-9470-4166-B9EE-4548DC416BBD}\SVROOT\Users\veenstra\AppData\Local\Temp\pip-install-7s_s0krk\numpy_a17179f1229646159af53045096af402\numpy\distutils\system_info.py:624: UserWarning:
                Atlas (http://math-atlas.sourceforge.net/) libraries not found.
                Directories to search for the libraries can be specified in the
                numpy/distutils/site.cfg file (section [atlas]) or by setting
                the ATLAS environment variable.
              self.calc_info()
            blas_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries blas not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            C:\SnapVolumesTemp\MountPoints\{45c63495-0000-0000-0000-100000000000}\{79DE0690-9470-4166-B9EE-4548DC416BBD}\SVROOT\Users\veenstra\AppData\Local\Temp\pip-install-7s_s0krk\numpy_a17179f1229646159af53045096af402\numpy\distutils\system_info.py:624: UserWarning:
                Blas (http://www.netlib.org/blas/) libraries not found.
                Directories to search for the libraries can be specified in the
                numpy/distutils/site.cfg file (section [blas]) or by setting
                the BLAS environment variable.
              self.calc_info()
            blas_src_info:
              NOT AVAILABLE

            C:\SnapVolumesTemp\MountPoints\{45c63495-0000-0000-0000-100000000000}\{79DE0690-9470-4166-B9EE-4548DC416BBD}\SVROOT\Users\veenstra\AppData\Local\Temp\pip-install-7s_s0krk\numpy_a17179f1229646159af53045096af402\numpy\distutils\system_info.py:624: UserWarning:
                Blas (http://www.netlib.org/blas/) sources not found.
                Directories to search for the sources can be specified in the
                numpy/distutils/site.cfg file (section [blas_src]) or by setting
                the BLAS_SRC environment variable.
              self.calc_info()
              NOT AVAILABLE

            'svnversion' is not recognized as an internal or external command,
            operable program or batch file.
            non-existing path in 'numpy\\distutils': 'site.cfg'
            'svnversion' is not recognized as an internal or external command,
            operable program or batch file.
            F2PY Version 2
            lapack_opt_info:
            lapack_mkl_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries mkl_rt not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            openblas_lapack_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries openblas not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            openblas_clapack_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries openblas,lapack not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            atlas_3_10_threads_info:
            Setting PTATLAS=ATLAS
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries tatlas,tatlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries tatlas,tatlas not found in C:\
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries tatlas,tatlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\libs
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\libs
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries tatlas,tatlas not found in C:\Users\veenstra\Anaconda3\Library\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\Library\lib
            <class 'numpy.distutils.system_info.atlas_3_10_threads_info'>
              NOT AVAILABLE

            atlas_3_10_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries satlas,satlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries satlas,satlas not found in C:\
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries satlas,satlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\libs
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\libs
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries satlas,satlas not found in C:\Users\veenstra\Anaconda3\Library\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\Library\lib
            <class 'numpy.distutils.system_info.atlas_3_10_info'>
              NOT AVAILABLE

            atlas_threads_info:
            Setting PTATLAS=ATLAS
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries ptf77blas,ptcblas,atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries ptf77blas,ptcblas,atlas not found in C:\
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries ptf77blas,ptcblas,atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\libs
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\libs
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries ptf77blas,ptcblas,atlas not found in C:\Users\veenstra\Anaconda3\Library\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\Library\lib
            <class 'numpy.distutils.system_info.atlas_threads_info'>
              NOT AVAILABLE

            atlas_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries f77blas,cblas,atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries f77blas,cblas,atlas not found in C:\
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries f77blas,cblas,atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\libs
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\libs
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries f77blas,cblas,atlas not found in C:\Users\veenstra\Anaconda3\Library\lib
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack_atlas not found in C:\Users\veenstra\Anaconda3\Library\lib
            <class 'numpy.distutils.system_info.atlas_info'>
              NOT AVAILABLE

            lapack_info:
            No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
            customize MSVCCompiler
              libraries lapack not found in ['C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\lib', 'C:\\', 'C:\\Users\\veenstra\\Anaconda3\\envs\\hatyan_hmcenv3\\libs', 'C:\\Users\\veenstra\\Anaconda3\\Library\\lib']
              NOT AVAILABLE

            C:\SnapVolumesTemp\MountPoints\{45c63495-0000-0000-0000-100000000000}\{79DE0690-9470-4166-B9EE-4548DC416BBD}\SVROOT\Users\veenstra\AppData\Local\Temp\pip-install-7s_s0krk\numpy_a17179f1229646159af53045096af402\numpy\distutils\system_info.py:624: UserWarning:
                Lapack (http://www.netlib.org/lapack/) libraries not found.
                Directories to search for the libraries can be specified in the
                numpy/distutils/site.cfg file (section [lapack]) or by setting
                the LAPACK environment variable.
              self.calc_info()
            lapack_src_info:
              NOT AVAILABLE

            C:\SnapVolumesTemp\MountPoints\{45c63495-0000-0000-0000-100000000000}\{79DE0690-9470-4166-B9EE-4548DC416BBD}\SVROOT\Users\veenstra\AppData\Local\Temp\pip-install-7s_s0krk\numpy_a17179f1229646159af53045096af402\numpy\distutils\system_info.py:624: UserWarning:
                Lapack (http://www.netlib.org/lapack/) sources not found.
                Directories to search for the sources can be specified in the
                numpy/distutils/site.cfg file (section [lapack_src]) or by setting
                the LAPACK_SRC environment variable.
              self.calc_info()
              NOT AVAILABLE

            C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib\site-packages\setuptools\_distutils\dist.py:265: UserWarning: Unknown distribution option: 'define_macros'
              warnings.warn(msg)
            running install
            C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib\site-packages\setuptools\command\install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
              warnings.warn(
            Traceback (most recent call last):
              File "C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib\site-packages\setuptools\_distutils\util.py", line 212, in subst_vars
                return _subst_compat(s).format_map(lookup)
            KeyError: '45c63495-0000-0000-0000-100000000000'

            During handling of the above exception, another exception occurred:

            Traceback (most recent call last):
              File "<string>", line 2, in <module>
              File "<pip-setuptools-caller>", line 34, in <module>
              File "C:\SnapVolumesTemp\MountPoints\{45c63495-0000-0000-0000-100000000000}\{79DE0690-9470-4166-B9EE-4548DC416BBD}\SVROOT\Users\veenstra\AppData\Local\Temp\pip-install-7s_s0krk\numpy_a17179f1229646159af53045096af402\setup.py", line 394, in <module>
                setup_package()
              File "C:\SnapVolumesTemp\MountPoints\{45c63495-0000-0000-0000-100000000000}\{79DE0690-9470-4166-B9EE-4548DC416BBD}\SVROOT\Users\veenstra\AppData\Local\Temp\pip-install-7s_s0krk\numpy_a17179f1229646159af53045096af402\setup.py", line 386, in setup_package
                setup(**metadata)
              File "C:\SnapVolumesTemp\MountPoints\{45c63495-0000-0000-0000-100000000000}\{79DE0690-9470-4166-B9EE-4548DC416BBD}\SVROOT\Users\veenstra\AppData\Local\Temp\pip-install-7s_s0krk\numpy_a17179f1229646159af53045096af402\numpy\distutils\core.py", line 169, in setup
                return old_setup(**new_attr)
              File "C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib\site-packages\setuptools\__init__.py", line 87, in setup
                return distutils.core.setup(**attrs)
              File "C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib\site-packages\setuptools\_distutils\core.py", line 185, in setup
                return run_commands(dist)
              File "C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib\site-packages\setuptools\_distutils\core.py", line 201, in run_commands
                dist.run_commands()
              File "C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib\site-packages\setuptools\_distutils\dist.py", line 969, in run_commands
                self.run_command(cmd)
              File "C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib\site-packages\setuptools\dist.py", line 1208, in run_command
                super().run_command(command)
              File "C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib\site-packages\setuptools\_distutils\dist.py", line 987, in run_command
                cmd_obj.ensure_finalized()
              File "C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib\site-packages\setuptools\_distutils\cmd.py", line 111, in ensure_finalized
                self.finalize_options()
              File "C:\SnapVolumesTemp\MountPoints\{45c63495-0000-0000-0000-100000000000}\{79DE0690-9470-4166-B9EE-4548DC416BBD}\SVROOT\Users\veenstra\AppData\Local\Temp\pip-install-7s_s0krk\numpy_a17179f1229646159af53045096af402\numpy\distutils\command\install.py", line 23, in finalize_options
                old_install.finalize_options(self)
              File "C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib\site-packages\setuptools\command\install.py", line 45, in finalize_options
                orig.install.finalize_options(self)
              File "C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib\site-packages\setuptools\_distutils\command\install.py", line 440, in finalize_options
                self.expand_basedirs()
              File "C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib\site-packages\setuptools\_distutils\command\install.py", line 620, in expand_basedirs
                self._expand_attrs(['install_base', 'install_platbase', 'root'])
              File "C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib\site-packages\setuptools\_distutils\command\install.py", line 614, in _expand_attrs
                val = subst_vars(val, self.config_vars)
              File "C:\Users\veenstra\Anaconda3\envs\hatyan_hmcenv3\lib\site-packages\setuptools\_distutils\util.py", line 214, in subst_vars
                raise ValueError(f"invalid variable {var}")
            ValueError: invalid variable '45c63495-0000-0000-0000-100000000000'
            [end of output]

        note: This error originates from a subprocess, and is likely not a problem with pip.
      error: legacy-install-failure

      × Encountered error while trying to install package.
      ╰─> numpy

      note: This is an issue with the package mentioned above, not pip.
      hint: See above for output from the failure.
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

Create hatyan 2.6.0 release

Tasks:

  • add xarray dependency
  • preferably fix #71
  • preferably fix #78
  • update hatyan history.rst
  • update docs (fix github action)
  • run pytest in full local mode and in hmc env. Check acceptance tests output
  • bumpversion minor
  • let AV and NB know about new release

support glob in `readts_dia`

This simplifies the configfile: replace the `for file_id in [1,2,3,4]` loop with `glob.glob(pattern)` (use `?` instead of `*` in the pattern to avoid also including the 19-year observation file). A minimal sketch is given below.
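
A minimal sketch of the simplified configfile snippet, assuming readts_dia reads one diafile at a time and that the yearly results can be concatenated with pandas (the filenames, pattern and the filename keyword are assumptions):

import glob
import pandas as pd
import hatyan

# '?' matches exactly one character, so a pattern like this picks up the yearly
# files but not a 19-year observation file (filenames are hypothetical)
file_list = sorted(glob.glob("data/HOEKVHLD_obs?.dia"))

# read each diafile and concatenate into one timeseries dataframe
# (assumes readts_dia accepts one filename at a time; keyword name is an assumption)
ts_measurements = pd.concat([hatyan.readts_dia(filename=file_dia) for file_dia in file_list])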

Update devenv to match HMC env

There is an installation method using venv and zip, in which a local venv is created for the user (or a central one with sudo) and PyQt5 and hatyan are installed into it from zip files: https://github.com/Deltares/hatyan#installation (OPTION 2). This method can replace the RPM, which makes it possible to depend on newer software (such as pandas >= 1.2.0).

On the HMC cluster, Python 3.8.11 (/opt/hmc-python/bin/python) is used with (selection):
matplotlib==3.5.2
netCDF4==1.6.0
numpy==1.23.0
packaging==21.3
pandas==1.4.3
pyproj==3.3.1
PyQt5==5.12.3
PyQt5-sip==12.11.0
PyQtWebEngine==5.12.1
pytz==2022.1
QtPy==2.1.0
scipy==1.3.1
seaborn==0.11.2
sip==6.6.2
six==1.16.0
spyder==5.1.5
spyder-kernels==2.1.3
urllib3==1.26.9
and xarray==2022.10.0

Steps:

  • update dependencies to python>=3.7, pandas>=1.2.0
  • remove pandas version check and assume pandas>=1.2.0
  • fix 1018 testcase
  • create environment_hmc.yml with all HMC fixed package versions (see the version-check sketch after this list)
  • remove scripts: hatyan.sh, hatyan_rpmbuild.sh, hatyan_rpmbuild_nobinaries.sh, hatyan_python-latest_python3.spec (also remove entry from .bumpversion.cfg)
  • remove workflow file: rpm-build-core.yml
  • remove old install methods from readme.md (option 2 and option 3)
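
A minimal sketch of how a dev environment could be checked against these pinned HMC versions (the package selection below is taken from the listing above; nothing hatyan-specific is assumed):

from importlib.metadata import version, PackageNotFoundError

# pinned versions from the HMC cluster listing above (selection)
hmc_pins = {"matplotlib": "3.5.2", "netCDF4": "1.6.0", "numpy": "1.23.0",
            "pandas": "1.4.3", "scipy": "1.3.1", "xarray": "2022.10.0"}

for pkg, pinned in hmc_pins.items():
    try:
        installed = version(pkg)
    except PackageNotFoundError:
        installed = None
    status = "OK" if installed == pinned else "MISMATCH"
    print(f"{pkg:12s} pinned={pinned:12s} installed={installed} [{status}]")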

Clean up repo

  • create requirements_hmc.txt and update pytest-hmcenv action
  • create requirements_dev.txt, which will be used by the mkdocs action
  • remove environment.yml files
  • setup.py/setup.cfg: align with dfm_tools
  • remove manifest.in (but test whether the data is included in the package, see the check below, and otherwise fix it in setup.cfg)
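
A minimal sketch of the data-inclusion check from the last item, assuming a "data" subfolder inside the installed package:

import os
import hatyan

# after `pip install hatyan` in a clean environment, check that the data files
# ended up inside the installed package (the "data" subfolder name is an assumption)
package_dir = os.path.dirname(hatyan.__file__)
data_dir = os.path.join(package_dir, "data")
print(os.path.isdir(data_dir))
if os.path.isdir(data_dir):
    print(os.listdir(data_dir)[:5])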

Use logging instead of prints

  • convert all warning prints to actual warnings?
  • use the logging module/file instead of any prints, like in ddlpy (including a logging level and a global setting for whether hatyan should print nothing, to screen, or to file); see the handler sketch after this list
  • enable verbose in cli
  • add silent mode?
  • check logging from cli (initialization/finalization and exec print in console or in STDOUT.txt, depending on the user's wishes)
  • check acceptance tests (do they not time out with the new logging?) >> spatial_plot and numbering_extremes time out
  • predictie_2022_frommergedcomp_LAT has no decent error message in pytest anymore
  • overwrite instead of unique-outputdir argument
  • clean up cli function
  • disable matplotlib logging in case of a verbose cli or a debug logging level in spyder/jupyter, which otherwise floods the output with lines like: "DEBUG:matplotlib.font_manager:findfont: score(FontEntry(fname='C:\\Windows\\Fonts\\LTYPE.TTF', name='Lucida Sans Typewriter', style='normal', variant='normal', weight=400, stretch='normal', size='scalable')) = 10.05" >> maybe set more things as info and test whether the warning level suppresses these; in that case it does not have to be solved
  • maybe set logging format to exclude the name? >> no
  • #177
  • add debug logging example to notebook (see below)
  • merge debug prints (see below)
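
A minimal sketch of combined screen and file logging with a configurable level (an illustration of the item above, not the current hatyan behaviour; the STDOUT.txt filename follows the cli item in this list):

import logging

# send log records both to the screen and to a file, with the level controlling
# how much is shown (nothing / info / debug)
logging.basicConfig(
    level=logging.INFO,
    format="%(levelname)s:%(name)s:%(message)s",
    handlers=[logging.StreamHandler(),             # to screen
              logging.FileHandler("STDOUT.txt")],  # to file
)
logging.getLogger("hatyan").setLevel(logging.DEBUG)  # more detail for hatyan only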

add debug logging example to the example notebook:

import logging
import hatyan

hatyan.close("all")
# enable debug logging for everything:
# logging.basicConfig(level=logging.DEBUG)
# or only for specific hatyan modules:
# hatyan.analysis_prediction.logger.setLevel(logging.DEBUG)
# hatyan.timeseries.logger.setLevel(logging.DEBUG)

merge debug lines into one:

DEBUG:hatyan.analysis_prediction:components used      = 95
DEBUG:hatyan.analysis_prediction:tstart               = 2019-01-01 00:00:00
DEBUG:hatyan.analysis_prediction:tstop                = 2020-01-01 00:00:00
DEBUG:hatyan.analysis_prediction:timestep             = <10 * Minutes>
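
A minimal sketch of the merged call (placeholder values, not the actual hatyan implementation):

import logging

logger = logging.getLogger("hatyan.analysis_prediction")

# placeholder values; in hatyan they would come from the prediction settings
n_components, tstart, tstop, timestep = 95, "2019-01-01 00:00:00", "2020-01-01 00:00:00", "<10 * Minutes>"

# one multi-line debug call instead of four separate ones
logger.debug("prediction settings:\n"
             f"components used = {n_components}\n"
             f"tstart          = {tstart}\n"
             f"tstop           = {tstop}\n"
             f"timestep        = {timestep}")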
