batchatco / go-native-netcdf
A native Go implementation of NetCDF4.
License: MIT License
Does this library support compression, and can it only write .nc files directly in HDF5 format?
Thanks for this useful library.
I ran into a problem when reading variable values: if a variable is compressed, I get wrong values back.
After some debugging, I found that these lines in the newRecordReader function in hdf5.go may have a bug:
if int64(thisSeg.offset+thisSeg.length) > h5.fileSize {
if int64(thisSeg.offset) >= h5.fileSize {
thisSeg.r = makeFillValueReader(obj, nil, int64(thisSeg.length))
} else {
length := h5.fileSize - int64(thisSeg.offset)
rr := newResetReader(thisSeg.r, length)
thisSeg.r = makeFillValueReader(obj, rr, int64(thisSeg.length))
}
}
The line `length := h5.fileSize - int64(thisSeg.offset)` computes the wrong length for compressed variable values. If I change
rr := newResetReader(thisSeg.r, length)
to
rr := newResetReader(thisSeg.r, int64(thisSeg.length))
I seem to get the right values.
I am not sure this change is right; please help.
Thanks! I have used this library successfully for reading and writing NetCDF files for use with ERDDAP.
I'm now faced with reading the GEBCO 2020 Grid, which is 7.5 GB. The lat and lon dimensions are okay, but reading the whole elevation variable into memory is not ideal.
It would be nice to have an option to read a variable without loading its data, and then a way to directly access a specific value (which would use seek() and read()):
// Read the NetCDF variable from the file, but not the values
vr, _ := nc.GetVariableReader("elevation") // VariableReader is like Variable, but has a Value function rather than Values
...
ele := vr.Value(ilat, ilon).(int16) // Value(index ...uint64) does seek() and read()
Below is the ncdump of the GEBCO 2020 netcdf
ncdump ~/Downloads/GEBCO_2020.nc | head -43
netcdf GEBCO_2020 {
dimensions:
	lon = 86400 ;
	lat = 43200 ;
variables:
	double lon(lon) ;
		lon:standard_name = "longitude" ;
		lon:long_name = "longitude" ;
		lon:units = "degrees_east" ;
		lon:axis = "X" ;
		lon:sdn_parameter_urn = "SDN:P01::ALONZZ01" ;
		lon:sdn_parameter_name = "Longitude east" ;
		lon:sdn_uom_urn = "SDN:P06::DEGE" ;
		lon:sdn_uom_name = "Degrees east" ;
	double lat(lat) ;
		lat:standard_name = "latitude" ;
		lat:long_name = "latitude" ;
		lat:units = "degrees_north" ;
		lat:axis = "Y" ;
		lat:sdn_parameter_urn = "SDN:P01::ALATZZ01" ;
		lat:sdn_parameter_name = "Latitude north" ;
		lat:sdn_uom_urn = "SDN:P06::DEGN" ;
		lat:sdn_uom_name = "Degrees north" ;
	short elevation(lat, lon) ;
		elevation:standard_name = "height_above_reference_ellipsoid" ;
		elevation:long_name = "Elevation relative to sea level" ;
		elevation:units = "m" ;
		elevation:sdn_parameter_urn = "SDN:P01::BATHHGHT" ;
		elevation:sdn_parameter_name = "Sea floor height (above mean sea level) {bathymetric height}" ;
		elevation:sdn_uom_urn = "SDN:P06::ULAA" ;
		elevation:sdn_uom_name = "Metres" ;

// global attributes:
		:Conventions = "CF-1.6" ;
		:title = "The GEBCO_2020 Grid - a continuous terrain model for oceans and land at 15 arc-second intervals" ;
		:institution = "On behalf of the General Bathymetric Chart of the Oceans (GEBCO), the data are held at the British Oceanographic Data Centre (BODC)." ;
		:source = "The GEBCO_2020 Grid is the latest global bathymetric product released by the General Bathymetric Chart of the Oceans (GEBCO) and has been developed through the Nippon Foundation-GEBCO Seabed 2030 Project. This is a collaborative project between the Nippon Foundation of Japan and GEBCO. The Seabed 2030 Project aims to bring together all available bathymetric data to produce the definitive map of the world ocean floor and make it available to all." ;
		:history = "Information on the development of the data set and the source data sets included in the grid can be found in the data set documentation available from https://www.gebco.net" ;
		:references = "DOI: 10.5285/a29c5465-b138-234d-e053-6c86abc040b9" ;
		:comment = "The data in the GEBCO_2020 Grid should not be used for navigation or any purpose relating to safety at sea." ;
		:node_offset = 1. ;
data:
I have an HDF5 file with 284 chunks in one dataset. For speed, I want to load just one chunk's data at a time, not the whole dataset.
So I'd like to know whether loading a single chunk is supported. If not, is there a plan to support it in the future?
Thanks.
Hi, is there a way of getting a variable's missing value directly from the NetCDF file?
I want to exclude all the missing values for a specific variable in a file. Or do I have to hardcode the missing value in a loop, like this?
v, err := nc.GetVariable("var")
for _, val := range v.Values.([]float32) { // element type depends on the variable
	if val == missingValue {
		// ignore
	}
}
Thank you for your project, but I have a problem reading a NetCDF file (abc.nc). The data can be read correctly using the netcdf-java library, but this library cannot read it correctly.
vr, err := nc.GetVariable("SSI")
vals, ok := vr.Values.([][]float32) // 2-D array values come back in the wrong order, with duplicated reads
TL;DR: I realized that this library does not allow reading NetCDF variables with more than two dimensions. I would really like support for three dimensions, and I developed a patch. I even created a patch to support three to seven dimensions, but that adds a lot of code and I could not test it properly.
I try to write and read back a 3D array.
The written array dump looks okay:
~/ $ ncdump newdata.nc
netcdf newdata {
dimensions:
	x = 2 ;
	y = 3 ;
	z = 4 ;
variables:
	ubyte x(x) ;
	ubyte y(y) ;
	ubyte z(z) ;
	ushort val(x, y, z) ;

// global attributes:
data:

 x = 0, 1 ;

 y = 0, 1, 2 ;

 z = 0, 1, 2, 3 ;

 val =
  0, 1, 2, 3,
  10, 11, 12, 13,
  20, 21, 22, 23,
  100, 101, 102, 103,
  110, 111, 112, 113,
  120, 121, 122, 123 ;
}
Note that each data value in the variable 'val' occurs only once.
The problem is with the data that is read back. My test program prints the following line:
2022-08-03 12:35:17.758 val[0]=[[0 1 2 3] [10 11 12 13] [20 21 22 23]] val[((len(val))-(1))]=[[3 10 11 12] [13 20 21 22] [23 100 101 102]] len(val)=2
Note that the values from val[0] repeat in val[1]. This is wrong; I expect it to look like this:
2022-08-03 12:35:17.758 val[0]=[[0 1 2 3] [10 11 12 13] [20 21 22 23]] val[((len(val))-(1))]=[[100 101 102 103] [110 111 112 113] [120 121 122 123]] len(val)=2
My test code is this. It generates the file newdata.nc using go-native-netcdf and tries to read it back:
package main

import (
	"fmt"
	"runtime"
	"time"

	"github.com/batchatco/go-native-netcdf/netcdf/api"
	"github.com/batchatco/go-native-netcdf/netcdf/cdf"
	"github.com/samber/lo"
)

func timeNow() string {
	return time.Now().Format("2006-01-02 15:04:05.000")
}

func writeXarray(fn string) {
	cw, err := cdf.OpenWriter(fn)
	if err != nil {
		fmt.Printf("%v cdf.OpenWriter(fn) err=%v\n", timeNow(), err)
		panic(err)
	}
	defer func() {
		fmt.Printf("%v close fn=%v\n", timeNow(), fn)
		if err := cw.Close(); err != nil {
			fmt.Printf("%v cw.Close() err=%v\n", timeNow(), err)
			panic(err)
		}
	}()

	fmt.Printf("%v define coordinate x \n", timeNow())
	nx := 2
	x := lo.Map[int, uint8](lo.Range(nx), func(v int, _ int) uint8 { return uint8(v) })
	if err := cw.AddVar("x", api.Variable{x, []string{"x"}, nil}); err != nil {
		fmt.Printf("%v cw.AddVar(\"x\") err=%v\n", timeNow(), err)
		panic(err)
	}

	fmt.Printf("%v define coordinate y \n", timeNow())
	ny := 3
	y := lo.Map[int, uint8](lo.Range(ny), func(v int, _ int) uint8 { return uint8(v) })
	if err := cw.AddVar("y", api.Variable{y, []string{"y"}, nil}); err != nil {
		fmt.Printf("%v cw.AddVar(\"y\") err=%v\n", timeNow(), err)
		panic(err)
	}

	fmt.Printf("%v define coordinate z \n", timeNow())
	nz := 4
	z := lo.Map[int, uint8](lo.Range(nz), func(v int, _ int) uint8 { return uint8(v) })
	if err := cw.AddVar("z", api.Variable{z, []string{"z"}, nil}); err != nil {
		fmt.Printf("%v cw.AddVar(\"z\") err=%v\n", timeNow(), err)
		panic(err)
	}

	fmt.Printf("%v fill 3d dataset val \n", timeNow())
	val := make([][][]uint16, nx)
	for i := 0; i < nx; i++ {
		val[i] = make([][]uint16, ny)
		for j := 0; j < ny; j++ {
			val[i][j] = make([]uint16, nz)
			for k := 0; k < nz; k++ {
				val[i][j][k] = uint16(100*i + 10*j + k)
			}
		}
	}
	if err := cw.AddVar("val", api.Variable{val, []string{"x", "y", "z"}, nil}); err != nil {
		fmt.Printf("%v cw.AddVar(\"val\") err=%v\n", timeNow(), err)
		panic(err)
	}
}

func readXarray(fn string) {
	nc, err := cdf.Open(fn)
	if err != nil {
		fmt.Printf("%v cdf.Open(fn) err=%v\n", timeNow(), err)
		panic(err)
	}
	defer func() {
		fmt.Printf("%v close file fn=%v\n", timeNow(), fn)
		nc.Close()
	}()

	fmt.Printf("%v obtain and show variable 'x' from netcdf file \n", timeNow())
	vrX, err := nc.GetVariable("x")
	if err != nil {
		fmt.Printf("%v nc.GetVariable(\"x\") err=%v\n", timeNow(), err)
		panic(err)
	}
	if vrX == nil {
		panic("variable x not found")
	}
	x, ok := vrX.Values.([]uint8)
	if !ok {
		panic("variable x conversion to []uint8 failed")
	}
	fmt.Printf("%v x[0]=%v x[((len(x))-(1))]=%v len(x)=%v\n", timeNow(), x[0], x[len(x)-1], len(x))

	fmt.Printf("%v obtain and show variable 'y' from netcdf file \n", timeNow())
	vrY, err := nc.GetVariable("y")
	if err != nil {
		fmt.Printf("%v nc.GetVariable(\"y\") err=%v\n", timeNow(), err)
		panic(err)
	}
	if vrY == nil {
		panic("variable y not found")
	}
	y, ok := vrY.Values.([]uint8)
	if !ok {
		panic("variable y conversion to []uint8 failed")
	}
	fmt.Printf("%v y[0]=%v y[((len(y))-(1))]=%v len(y)=%v\n", timeNow(), y[0], y[len(y)-1], len(y))

	fmt.Printf("%v obtain and show variable 'z' from netcdf file \n", timeNow())
	vrZ, err := nc.GetVariable("z")
	if err != nil {
		fmt.Printf("%v nc.GetVariable(\"z\") err=%v\n", timeNow(), err)
		panic(err)
	}
	if vrZ == nil {
		panic("variable z not found")
	}
	z, ok := vrZ.Values.([]uint8)
	if !ok {
		panic("variable z conversion to []uint8 failed")
	}
	fmt.Printf("%v z[0]=%v z[((len(z))-(1))]=%v len(z)=%v\n", timeNow(), z[0], z[len(z)-1], len(z))

	fmt.Printf("%v obtain and show variable 'val' from netcdf file \n", timeNow())
	vrVal, err := nc.GetVariable("val")
	if err != nil {
		fmt.Printf("%v nc.GetVariable(\"val\") err=%v\n", timeNow(), err)
		panic(err)
	}
	if vrVal == nil {
		panic("variable val not found")
	}
	val, ok := vrVal.Values.([][][]uint16)
	if !ok {
		panic("variable val conversion to [][][]uint16 failed")
	}
	fmt.Printf("%v val[0]=%v val[((len(val))-(1))]=%v len(val)=%v\n", timeNow(), val[0], val[len(val)-1], len(val))
}

func main() {
	fmt.Printf("%v netcdf runtime.Version()=%v\n", timeNow(), runtime.Version())
	fn := "newdata.nc"
	writeXarray(fn)
	readXarray(fn)
}
undefined: internal.NewLogger
undefined: internal.LevelFatal
etc.
For a .nc file, the data type is actually float, but go-native-netcdf resolves it to int16.
If a variable has multi-dimensional data, GetSlice will retrieve all the data of the slice along the first dimension.
Example: for a variable with dimensions time, latitude, longitude, GetSlice(0, 2) will retrieve the data for all latitude and longitude, for time 0 and 1.
I am wondering how this API could be improved to allow slicing all dimensions.