

h5z-zfp's Issues

H5Z-ZFP in combination with zfp 1.0.0: unable to build due to moved header file bitstream.h

The other day, I tried to build (CMake) H5Z-ZFP from branch mcm86-06apr22-use-zfp-vers-patch against zfp 1.0.0 (itself built with CMake), but unfortunately I did not succeed. The reason for the failure is that zfp's header file bitstream.h has moved from include to include/zfp, resulting in a compilation error at line 42 of H5Zzfp.c. I noticed that for both zfp 0.5.5 and 1.0.0 the file bitstream.h is already included by zfp.h. This makes me wonder: could we safely remove the inclusion of bitstream.h (line 42), or is it necessary for an older version of zfp?

Note that pull requests #78 and #80 are related to supporting zfp 1.0.0.

stream_word_bits not a function or function pointer

I can't get H5Z-ZFP to compile on Debian 8. I tried a few different gcc versions; below with gcc-4.8 because it gives a nicer error message. Note that I compiled zfp with -DBIT_STREAM_WORD_TYPE=uint8 and HDF5 with --enable-fortran and --enable-fortran2003.

joeyo@drumkit:~/sw/H5Z-ZFP$ make FC=gfortran CC=gcc-4.8 HDF5_HOME=/home/joeyo/sw/hdf5-1.8.18/my_install ZFP_HOME=/home/joeyo/sw/zfp PREFIX=/usr/local all
cd src; make ZFP_HOME=/home/joeyo/sw/zfp HDF5_HOME=/home/joeyo/sw/hdf5-1.8.18/my_install PREFIX=/usr/local all
make[1]: Entering directory '/home/joeyo/sw/H5Z-ZFP/src'
gcc-4.8 -c H5Zzfp.c -o H5Zzfp_lib.o -DH5Z_ZFP_AS_LIB  -I. -I/home/joeyo/sw/zfp/include -I/home/joeyo/sw/hdf5-1.8.18/my_install/include
H5Zzfp.c: In function ‘H5Z_zfp_can_apply’:
H5Zzfp.c:158:27: error: called object ‘stream_word_bits’ is not a function or function pointer
     if (B stream_word_bits() != 8)
                           ^
In file included from /home/joeyo/sw/zfp/include/zfp.h:66:0,
                 from H5Zzfp.c:35:
/home/joeyo/sw/zfp/include/bitstream.h:11:22: note: declared here
 extern_ const size_t stream_word_bits; /* bit stream granularity */
                      ^
Makefile:9: recipe for target 'H5Zzfp_lib.o' failed
make[1]: *** [H5Zzfp_lib.o] Error 1
make[1]: Leaving directory '/home/joeyo/sw/H5Z-ZFP/src'
Makefile:25: recipe for target 'all' failed
make: *** [all] Error 2
joeyo@drumkit:~/sw/H5Z-ZFP$

Strange result in fixed-accuracy mode

I'm trying the plugin out from h5py for the first time, and I'm getting what I think are some strange results in fixed-accuracy mode. It's perfectly possible I'm doing something wrong here.

My input file looks like this:

estan@newton:~$ h5dump -H -p rec-0600mm-0700mm-uncompressed.hdf5
HDF5 "rec-0600mm-0700mm-uncompressed.hdf5" {
GROUP "/" {
   DATASET "reconstruction" {
      DATATYPE  H5T_IEEE_F32LE
      DATASPACE  SIMPLE { ( 500, 300, 300 ) / ( 500, 300, 300 ) }
      STORAGE_LAYOUT {
         CHUNKED ( 32, 19, 38 )
         SIZE 189267968
      }
      FILTERS {
         NONE
      }
      FILLVALUE {
         FILL_TIME H5D_FILL_TIME_ALLOC
         VALUE  0
      }
      ALLOCATION_TIME {
         H5D_ALLOC_TIME_INCR
      }
   }
   DATASET "voxel_size" {
      DATATYPE  H5T_IEEE_F32LE
      DATASPACE  SIMPLE { ( 3, 1 ) / ( 3, 1 ) }
      STORAGE_LAYOUT {
         CONTIGUOUS
         SIZE 12
         OFFSET 189378040
      }
      FILTERS {
         NONE
      }
      FILLVALUE {
         FILL_TIME H5D_FILL_TIME_IFSET
         VALUE  0
      }
      ALLOCATION_TIME {
         H5D_ALLOC_TIME_LATE
      }
   }
}
}

And I have this simple test program:

from sys import argv
from struct import pack, unpack

import h5py

# ZFP filter number
ZFP_FILTER = 32013

# ZFP modes
ZFP_MODE_RATE = 1
ZFP_MODE_PRECISION = 2
ZFP_MODE_ACCURACY = 3
ZFP_MODE_EXPERT = 4

def zfp_rate_opts(rate):
    """Create compression options for ZFP in fixed-rate mode

    The float rate parameter is the number of compressed bits per value.
    """
    raw = pack('<d', rate)            # Pack as IEEE 754 double
    low = unpack('<I', raw[0:4])[0]   # Low-order 32 bits (little-endian)
    high = unpack('<I', raw[4:8])[0]  # High-order 32 bits
    return (ZFP_MODE_RATE, 0, low, high, 0, 0)

def zfp_precision_opts(precision):
    """Create compression options for ZFP in fixed-precision mode

    The integer precision parameter is the number of uncompressed bits per value.
    """
    return (ZFP_MODE_PRECISION, 0, precision, 0, 0, 0)

def zfp_accuracy_opts(accuracy):
    """Create compression options for ZFP in fixed-accuracy mode

    The float accuracy parameter is the absolute error tolerance (e.g. 0.001).
    """
    raw = pack('<d', accuracy)        # Pack as IEEE 754 double
    low = unpack('<I', raw[0:4])[0]   # Low-order 32 bits (little-endian)
    high = unpack('<I', raw[4:8])[0]  # High-order 32 bits
    return (ZFP_MODE_ACCURACY, 0, low, high, 0, 0)

def zfp_expert_opts(minbits, maxbits, maxprec, minexp):
    """Create compression options for ZFP in "expert" mode

    See the ZFP docs for the meaning of the parameters.
    """
    return (ZFP_MODE_EXPERT, 0, minbits, maxbits, maxprec, minexp)

def main():
    if len(argv) != 4:
        print('Usage: zfp-compress-hdf5.py <infile> <dataset> <outfile>')
        exit(1)

    with h5py.File(argv[1], 'r') as infile:
        with h5py.File(argv[3], 'w') as outfile:
            outfile.create_dataset(
                argv[2],
                data=infile[argv[2]].value,
                compression=ZFP_FILTER,
                compression_opts=zfp_accuracy_opts(0.005),
                chunks=(500, 300, 300),
                shuffle=False
            )

if __name__ == '__main__':
    main()

It takes an input HDF5 file, a dataset name, and an output filename, and creates the output HDF5 file with the given dataset copied from the input but compressed with zfp in fixed-accuracy mode at tolerance 0.005. Note also that I've hardcoded the HDF5 chunk size to 500x300x300, the exact size of my test input, to eliminate any chunking effects relative to the examples/zfp command line tool from zfp.
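As a sanity check on the cd_values packing in the script above, here is a minimal round-trip, independent of HDF5, showing that the two little-endian 32-bit words reassemble into the original double:

```python
from struct import pack, unpack

def split_double(value):
    # Split an IEEE 754 double into two little-endian 32-bit words,
    # in the same order the helper functions above produce them.
    raw = pack('<d', value)
    return unpack('<II', raw)

def join_double(word0, word1):
    # Inverse operation: reassemble the double from the two words.
    return unpack('<d', pack('<II', word0, word1))[0]

word0, word1 = split_double(0.005)
```

On a little-endian layout, the first word holds the low-order bits of the double, which is why the word order matters when building cd_values.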

Running it against my input produces an output file like this:

estan@newton:~$ h5dump -H -p rec-0600mm-0700mm-zfp.hdf5 
HDF5 "rec-0600mm-0700mm-zfp.hdf5" {
GROUP "/" {
   DATASET "reconstruction" {
      DATATYPE  H5T_IEEE_F32LE
      DATASPACE  SIMPLE { ( 500, 300, 300 ) / ( 500, 300, 300 ) }
      STORAGE_LAYOUT {
         CHUNKED ( 500, 300, 300 )
         SIZE 155369971 (1.159:1 COMPRESSION)
      }
      FILTERS {
         USER_DEFINED_FILTER {
            FILTER_ID 32013
            COMMENT H5Z-ZFP-0.3.0 (ZFP-0.5.0) github.com/LLNL/H5Z-ZFP; 
            
            PARAMS { 5242928 91252346 313532218 -1043792 -937099264 67112167 }
         }
      }
      FILLVALUE {
         FILL_TIME H5D_FILL_TIME_ALLOC
         VALUE  0
      }
      ALLOCATION_TIME {
         H5D_ALLOC_TIME_INCR
      }
   }
}
}

Notice how the compression ratio for the compressed dataset is just 1.159:1, which is a lot less than what I would have expected, so I'm pretty sure I'm doing something wrong.

And sure enough, if I first extract the dataset with h5dump to a binary file, then compress that file with examples/zfp from zfp, passing -f -3 500 300 300 -a 0.005 as options, I get a much better compression ratio (3.06:1).

So far I've been unable to see what I'm doing wrong :( First I thought I had misunderstood the cd_values format for the plugin, but I've verified that zfp_stream_set_accuracy is called with the correct tolerance (0.005).

Any ideas?

8-bit stream word size limitation

Hello Mark,

Not really an issue, more a question. A colleague of mine discovered that the H5Z_ZFP filter requires the ZFP library to be built with the CMake option:
-DZFP_BIT_STREAM_WORD_SIZE=8

These are the lines in the code that check stream_word_bits:

H5Z-ZFP/src/H5Zzfp.c

Lines 147 to 149 in 7b34cac

if ((int) B stream_word_bits != 8)
H5Z_ZFP_PUSH_AND_GOTO(H5E_PLINE, H5E_CANTINIT, -1,
"ZFP lib not compiled with -DBIT_STREAM_WORD_TYPE=uint8");

Now, I am wondering: what is the reason for this 8-bit limitation?
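My own guess at the motivation (an assumption, not confirmed in this thread): with a byte-sized stream word, the compressed stream is a plain byte sequence that serializes identically on little- and big-endian machines, while wider stream words do not. A small illustration:

```python
from struct import pack

# The same sequence of 32-bit stream words yields different bytes
# depending on byte order, whereas an 8-bit word stream is just a
# byte sequence and is identical on both architectures.
words = [0x12345678, 0x9ABCDEF0]
little = b''.join(pack('<I', w) for w in words)
big    = b''.join(pack('>I', w) for w in words)
```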

Best regards,
Jan-Willem

memory errors with H5Z-ZFP on 1.12.0 with free() but not H5MM_xfree()

Running on macOS 10.14.6 (Mojave) and HDF5 1.12.0 I am seeing memory errors at this line,

free(*buf);

but only when using the filter as a lib, not as a dynamic plugin. And if I switch free() to H5MM_xfree(), that fixes it.

test_write_lib(7783,0x11a8be5c0) malloc: *** error for object 0x7fcfe2023230: pointer being freed was not allocated
test_write_lib(7783,0x11a8be5c0) malloc: *** set a breakpoint in malloc_error_break to debug
/bin/bash: line 1:  7783 Abort trap: 6           ./test_write_lib rate=$r zfpmode=1 2>&1 > /dev/null
HDF5-DIAG: Error detected in HDF5 (1.12.0) thread 0:
  #000: ../../src/H5F.c line 793 in H5Fopen(): unable to open file
    major: File accessibility
    minor: Unable to open file
  #001: ../../src/H5VLcallback.c line 3500 in H5VL_file_open(): open failed
    major: Virtual Object Layer
    minor: Can't open object
  #002: ../../src/H5VLcallback.c line 3465 in H5VL__file_open(): open failed
    major: Virtual Object Layer
    minor: Can't open object
  #003: ../../src/H5VLnative_file.c line 100 in H5VL__native_file_open(): unable to open file
    major: File accessibility
    minor: Unable to open file
  #004: ../../src/H5Fint.c line 1716 in H5F_open(): unable to read root group
    major: File accessibility
    minor: Unable to open file
  #005: ../../src/H5Groot.c line 239 in H5G_mkroot(): can't check if symbol table message exists
    major: Symbol table
    minor: Can't get value
  #006: ../../src/H5Omessage.c line 883 in H5O_msg_exists(): unable to protect object header
    major: Object header
    minor: Unable to protect metadata
  #007: ../../src/H5Oint.c line 1082 in H5O_protect(): unable to load object header
    major: Object header
    minor: Unable to protect metadata
  #008: ../../src/H5AC.c line 1312 in H5AC_protect(): H5C_protect() failed
    major: Object cache
    minor: Unable to protect metadata
  #009: ../../src/H5C.c line 2346 in H5C_protect(): can't load entry
    major: Object cache
    minor: Unable to load metadata into cache
  #010: ../../src/H5C.c line 6689 in H5C_load_entry(): incorrect metadatda checksum after all read attempts
    major: Object cache
    minor: Read failed
  #011: ../../src/H5Ocache.c line 220 in H5O__cache_get_final_load_size(): can't deserialize object header prefix
    major: Object header
    minor: Unable to decode value
  #012: ../../src/H5Ocache.c line 1232 in H5O__prefix_deserialize(): bad object header version number
    major: Object header
    minor: Wrong version number
H5Fopen failed at line 144
Library rate test failed for rate=32
make[1]: *** [test-lib-rate] Error 1
make: *** [check] Error 2
[scratlantis:~/silo/zfp_filter/H5Z-ZFP] miller86% lldb test/test_write_lib
(lldb) target create "test/test_write_lib"
Current executable set to 'test/test_write_lib' (x86_64).
(lldb) run rate=8 zfpmode=1
Process 7827 launched: '/Users/miller86/silo/zfp_filter/H5Z-ZFP/test/test_write_lib' (x86_64)
    ifile=""                                  set input filename
    ofile="test_zfp.h5"                      set output filename
    
1D dataset generation arguments...
    npoints=1024             set number of points for 1D dataset
    noise=0.001         set amount of random noise in 1D dataset
    amp=17.7             set amplitude of sinusoid in 1D dataset
    chunk=256                      set chunk size for 1D dataset
    doint=0                              also do integer 1D data
    
ZFP compression paramaters...
    zfpmode=1        (1=rate,2=prec,3=acc,4=expert,5=reversible)
    rate=8                                set rate for rate mode
    acc=0                         set accuracy for accuracy mode
    prec=11                     set precision for precision mode
    minbits=0                        set minbits for expert mode
    maxbits=4171                     set maxbits for expert mode
    maxprec=64                       set maxprec for expert mode
    minexp=-1074                      set minexp for expert mode
                            
Advanced cases...
    highd=0                                          run 4D case
    sixd=0          run 6D extendable case (requires ZFP>=0.5.4)
    help=0                                     this help message
test_write_lib(7827,0x1000ec5c0) malloc: *** error for object 0x101026e30: pointer being freed was not allocated
test_write_lib(7827,0x1000ec5c0) malloc: *** set a breakpoint in malloc_error_break to debug
Process 7827 stopped
* thread #1, queue = 'com.apple.main-thread', stop reason = signal SIGABRT
    frame #0: 0x00007fff590ff2c2 libsystem_kernel.dylib`__pthread_kill + 10
libsystem_kernel.dylib`__pthread_kill:
->  0x7fff590ff2c2 <+10>: jae    0x7fff590ff2cc            ; <+20>
    0x7fff590ff2c4 <+12>: movq   %rax, %rdi
    0x7fff590ff2c7 <+15>: jmp    0x7fff590f9453            ; cerror_nocancel
    0x7fff590ff2cc <+20>: retq   
Target 0: (test_write_lib) stopped.
(lldb) bt
* thread #1, queue = 'com.apple.main-thread', stop reason = signal SIGABRT
  * frame #0: 0x00007fff590ff2c2 libsystem_kernel.dylib`__pthread_kill + 10
    frame #1: 0x00007fff591babf1 libsystem_pthread.dylib`pthread_kill + 284
    frame #2: 0x00007fff590696a6 libsystem_c.dylib`abort + 127
    frame #3: 0x00007fff59178077 libsystem_malloc.dylib`malloc_vreport + 545
    frame #4: 0x00007fff59177e38 libsystem_malloc.dylib`malloc_report + 151
    frame #5: 0x0000000100008ec1 test_write_lib`H5Z_filter_zfp(flags=0, cd_nelmts=4, cd_values=0x0000000100b6bfe8, nbytes=2048, buf_size=0x00007ffeefbfe158, buf=0x00007ffeefbfe200) at H5Zzfp.c:628:9
    frame #6: 0x00000001008595a0 libhdf5.200.dylib`H5Z_pipeline(pline=0x000000010100a4c0, flags=0, filter_mask=0x00007ffeefbfe1a8, edc_read=H5Z_ENABLE_EDC, cb_struct=H5Z_cb_t @ 0x00007ffeefbfe040, nbytes=0x00007ffeefbfe150, buf_size=0x00007ffeefbfe158, buf=0x00007ffeefbfe200) at H5Z.c:1366:26
    frame #7: 0x000000010022a80b libhdf5.200.dylib`H5D__chunk_flush_entry(dset=0x0000000100b692a0, ent=0x0000000100b6e630, reset=false) at H5Dchunk.c:3413:16
    frame #8: 0x0000000100222ccf libhdf5.200.dylib`H5D__chunk_flush(dset=0x0000000100b692a0) at H5Dchunk.c:2824:12
    frame #9: 0x0000000100271150 libhdf5.200.dylib`H5D__flush_real(dataset=0x0000000100b692a0) at H5Dint.c:3233:50
    frame #10: 0x000000010026feca libhdf5.200.dylib`H5D_close(dataset=0x0000000100b692a0) at H5Dint.c:1920:12
    frame #11: 0x000000010083930d libhdf5.200.dylib`H5VL__native_dataset_close(dset=0x0000000100b692a0, dxpl_id=792633534417207304, req=0x0000000000000000) at H5VLnative_dataset.c:634:8
    frame #12: 0x0000000100802e88 libhdf5.200.dylib`H5VL__dataset_close(obj=0x0000000100b692a0, cls=0x0000000100b1d540, dxpl_id=792633534417207304, req=0x0000000000000000) at H5VLcallback.c:2595:8
    frame #13: 0x0000000100802a9c libhdf5.200.dylib`H5VL_dataset_close(vol_obj=0x0000000100b69150, dxpl_id=792633534417207304, req=0x0000000000000000) at H5VLcallback.c:2633:8
    frame #14: 0x000000010027c003 libhdf5.200.dylib`H5D__close_cb(dset_vol_obj=0x0000000100b69150) at H5Dint.c:352:8
    frame #15: 0x0000000100452864 libhdf5.200.dylib`H5I_dec_ref(id=360287970189639681) at H5I.c:1376:41
    frame #16: 0x0000000100452535 libhdf5.200.dylib`H5I_dec_app_ref(id=360287970189639681) at H5I.c:1421:21
    frame #17: 0x00000001004529f5 libhdf5.200.dylib`H5I_dec_app_ref_always_close(id=360287970189639681) at H5I.c:1465:17
    frame #18: 0x0000000100209d7e libhdf5.200.dylib`H5Dclose(dset_id=360287970189639681) at H5D.c:337:8
    frame #19: 0x0000000100003449 test_write_lib`main(argc=3, argv=0x00007ffeefbff7f8) at test_write.c:456:13
    frame #20: 0x00007fff58fc43d5 libdyld.dylib`start + 1
    frame #21: 0x00007fff58fc43d5 libdyld.dylib`start + 1
(lldb) frame 5
invalid command 'frame 5'.
(lldb) up 5
frame #5: 0x0000000100008ec1 test_write_lib`H5Z_filter_zfp(flags=0, cd_nelmts=4, cd_values=0x0000000100b6bfe8, nbytes=2048, buf_size=0x00007ffeefbfe158, buf=0x00007ffeefbfe200) at H5Zzfp.c:628:9
   625 	        if (zsize > msize)
   626 	            H5Z_ZFP_PUSH_AND_GOTO(H5E_RESOURCE, H5E_OVERFLOW, 0, "uncompressed buffer overrun");
   627 	
-> 628 	        free(*buf);
   629 	        *buf = newbuf;
   630 	        newbuf = 0;
   631 	        *buf_size = zsize;

@brtnfld do you have any ideas?

CMake fails to use Autotools-built zfp and hdf5

When pointing to Autotools-built hdf5 and zfp,

-D CMAKE_PREFIX_PATH="$HDF5" \
-D ZFP_DIR="$ZFP" \

h5z-zfp reports that it finds HDF5, but it does not add the correct link flags to the compilation.
zfp needs to provide *.cmake package files, so Autotools builds of zfp are not supported.

Retrieving fixed accuracy parameters from ZFP encoded HDF5 datasets

Hello,

I have gone to great pains to carry fixed-accuracy parameter metadata through all of my conversions of data that use ZFP. I often operate on ZFP compressed data and compress the results, and I want to make sure my final accuracy parameters are OK given the original accuracy parameters.

However, it occurs to me that, at least for a saved ZFP encoded HDF5 dataset, it should be possible to open an HDF5 file with ZFP compressed data and retrieve the original floating point representation of the accuracy parameter for each dataset (I know this is possible with the zfp library). It is not evident how to do this with the H5Z-ZFP interface, but that is what I want: the ability to retrieve the ZFP fixed-accuracy parameter of an H5Z-ZFP compressed HDF5 dataset.
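For the input side at least, the decoding is straightforward. The sketch below (my assumption: it inverts the generic cd_values packing shown in the h5py script earlier on this page, with the tolerance double split into two little-endian 32-bit words) recovers the tolerance from accuracy-mode cd_values. Note this is the property-list encoding, not the on-disk ZFP header the filter stores:

```python
from struct import pack, unpack

ZFP_MODE_ACCURACY = 3  # mode code used in H5Z-ZFP generic cd_values

def accuracy_from_cd_values(cd_values):
    # Hedged sketch: invert the input-side packing for fixed-accuracy
    # mode (two little-endian 32-bit words holding the tolerance double).
    # This is NOT the on-disk ZFP header format the filter writes.
    mode, _, word0, word1 = cd_values[:4]
    if mode != ZFP_MODE_ACCURACY:
        raise ValueError('not fixed-accuracy cd_values')
    return unpack('<d', pack('<II', word0, word1))[0]
```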

Test suite errors on s390x

Debian build machines report a failure of the test suite on s390x.

The test suite suppresses output, so I ran the failing tests again manually:

test$ HDF5_PLUGIN_PATH=../src/plugin/ ./test_read_lib ifile=test_zfp_030040.h5 max_reldiff=0.025 
    ifile="test_zfp_030040.h5"                set input filename
    max_absdiff=0                      set maximum absolute diff
    max_reldiff=0.025                  set maximum relative diff
    doint=0                       check integer datasets instead
    ret=0                  return 1 if diffs (0=all,1=abs,2=rel)
    help=0                                     this help message
HDF5-DIAG: Error detected in HDF5 (1.10.8) thread 1:
  #000: ../../../src/H5Dio.c line 186 in H5Dread(): can't read data
    major: Dataset
    minor: Read failed
  #001: ../../../src/H5Dio.c line 584 in H5D__read(): can't read data
    major: Dataset
    minor: Read failed
  #002: ../../../src/H5Dchunk.c line 2544 in H5D__chunk_read(): unable to read raw data chunk
    major: Low-level I/O
    minor: Read failed
  #003: ../../../src/H5Dchunk.c line 3898 in H5D__chunk_lock(): data pipeline read failed
    major: Dataset
    minor: Filter operation failed
  #004: ../../../src/H5Z.c line 1400 in H5Z_pipeline(): filter returned failure during read
    major: Data filters
    minor: Read failed
  #005: H5Zzfp.c line 575 in H5Z_filter_zfp(): can't get ZFP mode/meta
    major: Data filters
    minor: Can't get value
  #006: H5Zzfp.c line 464 in get_zfp_info_from_cd_values(): ZFP codec version mismatch
    major: Data filters
    minor: Bad value
H5Dread failed at line 165, errno=2 (No such file or directory)

Same for ifile=test_zfp_030235.h5 max_reldiff=0.025 and ifile=test_zfp_110050.h5 max_reldiff=0.025.

For the h5repack -f UD=32013,0,4,3,0,3539053052,1062232653 test, I get a ratio of 99.

I tried to extract all the information I could think of; let me know if I can help debug this further.

Use H5Eregister_class correctly or remove

At some point, I had concluded it was necessary for H5Z-ZFP to register its own error class (H5Eregister_class()) in order to ensure error messages from the HDF5 stack would be maximally interpretable by a user. In particular, the registered error class would include "H5Z-ZFP" in the message strings along with H5Z-ZFP version and ZFP Library version information.

However, I am not sure that simply registering a new error class and using it in H5Epush is sufficient. The documentation for H5Epush says that major and minor errors "...must be in the same class," which suggests that more work is needed in the filter code, with calls to H5Ecreate_msg(), before pushing errors involving the new error class will work.

@qkoziol or @epourmal can you comment or point me in the right direction?

Spack install error

Installing via spack install h5z-zfp, I encountered the following error. Do you have any advice for getting past this? Thanks!

==> Executing phase: 'edit'
==> Using default implementation: skipping edit phase.
==> Executing phase: 'build'
==> 'make' '-j12' 'PREFIX=/home/drk/code/spack/opt/spack/linux-centos7-x86_64/gcc-4.8.5/h5z-zfp-0.7.0-tchfkud7llxfepfhq2undgrknbjofykt' 'CC=/home/drk/code/spack/lib/spack/env/gcc/gcc' 'HDF5_HOME=/home/drk/code/spack/opt/spack/linux-centos7-x86_64/gcc-4.8.5/hdf5-1.10.1-ri5ggz4lmcc7h2ml5uv4xc6cf6ju5kzu' 'ZFP_HOME=/home/drk/code/spack/opt/spack/linux-centos7-x86_64/gcc-4.8.5/zfp-0.5.1-pihaqxjtwtnes32bjs46l4dvlqzpwg5q' 'FC=/home/drk/code/spack/lib/spack/env/gcc/gfortran' 'all'
cd src; make ZFP_HOME=/home/drk/code/spack/opt/spack/linux-centos7-x86_64/gcc-4.8.5/zfp-0.5.1-pihaqxjtwtnes32bjs46l4dvlqzpwg5q HDF5_HOME=/home/drk/code/spack/opt/spack/linux-centos7-x86_64/gcc-4.8.5/hdf5-1.10.1-ri5ggz4lmcc7h2ml5uv4xc6cf6ju5kzu PREFIX=/home/drk/code/spack/opt/spack/linux-centos7-x86_64/gcc-4.8.5/h5z-zfp-0.7.0-tchfkud7llxfepfhq2undgrknbjofykt all
make[1]: Entering directory `/tmp/drk/spack-stage/spack-stage-_zdn_dju/H5Z-ZFP/src'
/home/drk/code/spack/lib/spack/env/gcc/gcc -c H5Zzfp.c -o H5Zzfp_lib.o -DH5Z_ZFP_AS_LIB -fPIC -I. -I/home/drk/code/spack/opt/spack/linux-centos7-x86_64/gcc-4.8.5/zfp-0.5.1-pihaqxjtwtnes32bjs46l4dvlqzpwg5q/include -I/home/drk/code/spack/opt/spack/linux-centos7-x86_64/gcc-4.8.5/hdf5-1.10.1-ri5ggz4lmcc7h2ml5uv4xc6cf6ju5kzu/include
/home/drk/code/spack/lib/spack/env/gcc/gcc -c H5Zzfp_props.c -o H5Zzfp_props.o -fPIC -I. -I/home/drk/code/spack/opt/spack/linux-centos7-x86_64/gcc-4.8.5/zfp-0.5.1-pihaqxjtwtnes32bjs46l4dvlqzpwg5q/include -I/home/drk/code/spack/opt/spack/linux-centos7-x86_64/gcc-4.8.5/hdf5-1.10.1-ri5ggz4lmcc7h2ml5uv4xc6cf6ju5kzu/include
/home/drk/code/spack/lib/spack/env/gcc/gfortran -c H5Zzfp_props_f.F90 -o H5Zzfp_props_f.o -fPIC -I. -I/home/drk/code/spack/opt/spack/linux-centos7-x86_64/gcc-4.8.5/zfp-0.5.1-pihaqxjtwtnes32bjs46l4dvlqzpwg5q/include -I/home/drk/code/spack/opt/spack/linux-centos7-x86_64/gcc-4.8.5/hdf5-1.10.1-ri5ggz4lmcc7h2ml5uv4xc6cf6ju5kzu/include
/home/drk/code/spack/lib/spack/env/gcc/gcc -c H5Zzfp.c -o H5Zzfp_plugin.o -fPIC -I. -I/home/drk/code/spack/opt/spack/linux-centos7-x86_64/gcc-4.8.5/zfp-0.5.1-pihaqxjtwtnes32bjs46l4dvlqzpwg5q/include -I/home/drk/code/spack/opt/spack/linux-centos7-x86_64/gcc-4.8.5/hdf5-1.10.1-ri5ggz4lmcc7h2ml5uv4xc6cf6ju5kzu/include
H5Zzfp_props_f.F90:4.6:

USE HDF5
1
Fatal Error: Can't open module file 'hdf5.mod' for reading at (1): No such file or directory
make[1]: *** [H5Zzfp_props_f.o] Error 1
make[1]: *** Waiting for unfinished jobs....
make[1]: Leaving directory `/tmp/drk/spack-stage/spack-stage-_zdn_dju/H5Z-ZFP/src'
make: *** [all] Error 2

Re-check endian handling

We have test data from BE and LE systems that is compared during testing using the h5diff tool. However, codecov is not showing the lines related to this as covered during testing.

Other ZFP features to consider supporting

  • Writes of data already compressed in memory (e.g. zfp compressed array data)
  • Reads of data without decompressing (e.g. instantiate ZFP compressed array)
  • Writes from or reads to GPU memory including threaded acceleration for the ZFP compression
    • Will apps want to write directly from GPU mem or read directly into GPU mem
  • More scalar types (half precision / quad precision)
    • Posits? posithub.org
  • Zero-dimensional arrays (single scalar per block)

ZFP filter seems unable to compress the data in an HDF5 file

Hi,
I am testing the HDF5 filter but have encountered a problem.
The generated .zfp.h5 file always has the same size as the original .h5 file.
I am sure that HDF5_PLUGIN_PATH is set correctly and that the zfp plugin's .so file can be found in this path. The version of the HDF5 library should be fine too (I tested 1.10.3, 1.10.7 and 1.12.1), because I can use these versions to run other filters such as H5Z_SZ correctly.
Any ideas?

I described the detailed settings as follows:
I used the following command to compile H5Z-ZFP:

make CC=gcc HDF5_HOME=/home/sdi/Install/hdf5-1.10.3-install ZFP_HOME=/home/sdi/Development/zfp/zfp-0.5.4/zfp PREFIX=/home/sdi/Install/H5Z-ZFP-install all
make CC=gcc HDF5_HOME=/home/sdi/Install/hdf5-1.10.3-install ZFP_HOME=/home/sdi/Development/zfp/zfp-0.5.4/zfp PREFIX=/home/sdi/Install/H5Z-ZFP-install install

Then, /home/sdi/Install/H5Z-ZFP-install has been put in the HDF5_PLUGIN_PATH.
For example:
[sdi@localhost H5Z-ZFP]$ echo $HDF5_PLUGIN_PATH
/home/sdi/Install/sz-2.1.12-install/lib:/home/sdi/Install/H5Z-ZFP-install/plugin:/home/sdi/Development/H5Z/H5Z-SZ3/SZ3/build/sz3-install/lib64
Then,
[sdi@localhost test]$ h5repack -f UD=32013,0,4,1,0,0,1074528256 -i mesh.h5 -o mesh.zfp.h5
[sdi@localhost test]$ ls -al mesh.*
-rw-rw-r-- 1 sdi sdi 1151188 Aug 18 16:08 mesh.h5
-rw-rw-r-- 1 sdi sdi 1152003 Aug 19 16:29 mesh.zfp.h5
[sdi@localhost test]$
The size is not reduced at all.
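As an aside on the command above: under the assumption that the cd_values after the mode code are the rate double split into two little-endian 32-bit words (consistent with the h5py script earlier on this page), the UD argument can be rebuilt programmatically, and the command above would be requesting rate 3.5:

```python
from struct import pack, unpack

def h5repack_zfp_rate(rate):
    # Hedged sketch: build the h5repack '-f UD=...' argument for ZFP
    # fixed-rate mode, assuming cd_values are (mode=1, unused, low word,
    # high word) with the rate double split little-endian, matching the
    # form of the command above.
    low, high = unpack('<II', pack('<d', rate))
    return 'UD=32013,0,4,1,0,%d,%d' % (low, high)
```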

Got HDF5 error when H5Z-ZFP plugin used with HDF5 1.12.1, NetCDF 4.8.1 and zfp unbiased branch

I recently built the H5Z-ZFP plugin with "make", with HDF5_HOME pointing to HDF5 1.12.1 and ZFP_HOME pointing to the zfp unbiased branch. I then used this plugin with NetCDF 4.8.1, which was built on HDF5 1.12.1. I kept getting an HDF5 error at nc_enddef. I have no such problem with the prebuilt HDF5 plugin provided at https://portal.hdfgroup.org/display/support/HDF5%201.12.1 on the same HDF5 and NetCDF versions. Could you please help me solve this?

Build error with CMake

With CMake 3.12.4, when I try to build with:

cmake \
-D CMAKE_BUILD_TYPE=Release \
-D CMAKE_INSTALL_PREFIX=$PWD/h5z-zfp \
-D CMAKE_PREFIX_PATH="$HDF5" \
..

I get the error:

CMake Error at src/CMakeLists.txt:28 (install):
install TARGETS given no ARCHIVE DESTINATION for static library target
"h5z_zfp_static".

On another machine with CMake 3.17 it works.

Software engineering improvements

Consider here several possible improvements to the way the code is designed...

Encode ZFP params only as expert mode params

With regard to a proposal for a solution to this issue, #100 (comment) makes a compelling case for how to encode cd_values in a clean way: always encode cd_values as if expert mode was used.

#100 (comment) makes the point that the current situation is needed from a user interface point of view, to avoid making h5repack usable only for users who can juggle expert values.

I wondered whether there's a way to extract/compute the expert-level values from the zfp_stream_set_{rate|precision|accuracy|reversible} zfp calls. I checked what they do, and indeed, they seem to simply compute and set the 4 "expert mode" integers in the zfp header. So it's possible to turn all other zfpmode inputs into cd_values encoded in expert mode.
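For illustration, a hedged sketch of the mapping for one mode. This is my reading of zfp_stream_set_rate, simplified: the real call also enforces a minimum bit count and optionally rounds maxbits up to a stream-word multiple, which this ignores:

```python
def rate_to_expert(rate, ndims):
    # Assumption (simplified): fixed-rate mode is equivalent to expert
    # mode with minbits == maxbits == rate * 4**ndims bits per block,
    # and maxprec/minexp left wide open (64 / -1074 for doubles).
    maxbits = int(rate * 4 ** ndims)
    return {'minbits': maxbits, 'maxbits': maxbits,
            'maxprec': 64, 'minexp': -1074}
```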

A patch proposal would thus be to:

  • Document that only HDF5 files encoded in expert mode are portable across different architectures
  • Change print_h5repack_farg.c to always generate cd_values with zfpmode=4 (-f UD=32013,0,5,4,…)
  • Update the examples and tests accordingly
  • Possibly print a warning about portability during encoding/decoding on big endian machines, if zfpmode in cd_values is not 4/expert

This would:

  • always generate files compatible with old versions of H5Z-ZFP, which support expert mode cd_values
  • stop generating new files that cannot be ported across architectures
  • require no change in the h5repack API: only changes to print_h5repack_farg.c and documentation

Originally posted by @spanezz in #100 (comment)

Also, see this note from @lindstro

Simplify headers and plugin/lib to single header and single source file

There are 6 header files and 2 source files. I think this could be simplified down to a single source file, H5Zzfp.c, and a single header file, H5Zzfp.h, and still maintain all the current build options and interfaces.

This would mean putting all the property functions currently in H5Zzfp_props.c into H5Zzfp.c, which would mean that source file contains more than just the code implementing the HDF5 filter interface. But I think that is fine, and there is already some of that going on there because it implements the H5Z_zfp_initialize() and H5Z_zfp_finalize() functions. Callers wishing to use the filter as a library would just explicitly link to the filter instead of only ensuring the HDF5 library can find and dlopen it via the HDF5_PLUGIN_PATH env. variable.

Should we eliminate GNU Make?

We're maintaining two build interfaces. I tend to use GNU Make, but I know others like/want CMake. We get support on Windows only from CMake. I don't like forcing a dependence on CMake in order to build. OTOH, CMake is pretty ubiquitous now...especially if we're conservative about the minimum version requirement. Our CI uses GNU Make for Linux and CMake for Windows. We don't run tests with CMake...we'd need to plug that together (which may mostly already be addressed in #89).

Better organization of docs

I've got some nitty-gritty details and general usage mixed together in the documentation, which I think makes things a bit confusing. These could be re-organized a tad better by moving a lot of the details to something like an Advanced Issues section.

Update spack recipe to pin versions correctly

Running spack install h5z-zfp gives the following log

==> h5z-zfp: Executing phase: 'edit'
==> h5z-zfp: Executing phase: 'build'
==> [2023-08-10-13:32:58.659792] 'make' '-j8' 'PREFIX=/home/alex/spack/opt/spack/linux-manjaro23-skylake/gcc-13.1.1/h5z-zfp-1.0.1-hceptxfyniryctra5xl6qvyt4bdly7a2' 'CC=/home/alex/spack/opt/spack/linux-manjaro23-skylake/gcc-13.1.1/openmpi-4.1.5-e74unvu5stl6a7xofrep47lsyytjud4f/bin/mpicc' 'HDF5_HOME=/home/alex/spack/opt/spack/linux-manjaro23-skylake/gcc-13.1.1/hdf5-1.14.1-2-olutqyj477ivhudzxiyaaex57h3d6ax7' 'ZFP_HOME=/home/alex/spack/opt/spack/linux-manjaro23-skylake/gcc-13.1.1/zfp-1.0.0-quqjakne3vm6hfusiharnwxtopxvflr2' 'FC=/home/alex/spack/opt/spack/linux-manjaro23-skylake/gcc-13.1.1/openmpi-4.1.5-e74unvu5stl6a7xofrep47lsyytjud4f/bin/mpif90' 'all'
grep: warning: stray \ before #
grep: /H5Zzfp_plugin.h: No such file or directory
cd src; make ZFP_HOME=/home/alex/spack/opt/spack/linux-manjaro23-skylake/gcc-13.1.1/zfp-1.0.0-quqjakne3vm6hfusiharnwxtopxvflr2  PREFIX=/home/alex/spack/opt/spack/linux-manjaro23-skylake/gcc-13.1.1/h5z-zfp-1.0.1-hceptxfyniryctra5xl6qvyt4bdly7a2 all
make[1]: Entering directory '/tmp/alex/spack-stage/spack-stage-h5z-zfp-1.0.1-hceptxfyniryctra5xl6qvyt4bdly7a2/spack-src/src'
grep: warning: stray \ before #
/home/alex/spack/opt/spack/linux-manjaro23-skylake/gcc-13.1.1/openmpi-4.1.5-e74unvu5stl6a7xofrep47lsyytjud4f/bin/mpicc -c H5Zzfp.c -o H5Zzfp_lib.o -DH5Z_ZFP_AS_LIB -fPIC -I. -I/home/alex/spack/opt/spack/linux-manjaro23-skylake/gcc-13.1.1/zfp-1.0.0-quqjakne3vm6hfusiharnwxtopxvflr2/include -I/home/alex/spack/opt/spack/linux-manjaro23-skylake/gcc-13.1.1/hdf5-1.14.1-2-olutqyj477ivhudzxiyaaex57h3d6ax7
/home/alex/spack/opt/spack/linux-manjaro23-skylake/gcc-13.1.1/openmpi-4.1.5-e74unvu5stl6a7xofrep47lsyytjud4f/bin/mpicc -c H5Zzfp_props.c -o H5Zzfp_props.o -fPIC -I. -I/home/alex/spack/opt/spack/linux-manjaro23-skylake/gcc-13.1.1/zfp-1.0.0-quqjakne3vm6hfusiharnwxtopxvflr2/include -I/home/alex/spack/opt/spack/linux-manjaro23-skylake/gcc-13.1.1/hdf5-1.14.1-2-olutqyj477ivhudzxiyaaex57h3d6ax7
/home/alex/spack/opt/spack/linux-manjaro23-skylake/gcc-13.1.1/openmpi-4.1.5-e74unvu5stl6a7xofrep47lsyytjud4f/bin/mpif90 -c H5Zzfp_props_f.F90 -o H5Zzfp_props_f.o  -I. -I/home/alex/spack/opt/spack/linux-manjaro23-skylake/gcc-13.1.1/zfp-1.0.0-quqjakne3vm6hfusiharnwxtopxvflr2/include -I/home/alex/spack/opt/spack/linux-manjaro23-skylake/gcc-13.1.1/hdf5-1.14.1-2-olutqyj477ivhudzxiyaaex57h3d6ax7
/home/alex/spack/opt/spack/linux-manjaro23-skylake/gcc-13.1.1/openmpi-4.1.5-e74unvu5stl6a7xofrep47lsyytjud4f/bin/mpicc -c H5Zzfp.c -o H5Zzfp_plugin.o -fPIC -I. -I/home/alex/spack/opt/spack/linux-manjaro23-skylake/gcc-13.1.1/zfp-1.0.0-quqjakne3vm6hfusiharnwxtopxvflr2/include -I/home/alex/spack/opt/spack/linux-manjaro23-skylake/gcc-13.1.1/hdf5-1.14.1-2-olutqyj477ivhudzxiyaaex57h3d6ax7
H5Zzfp.c:42:10: fatal error: bitstream.h: No such file or directory
   42 | #include "bitstream.h"
      |          ^~~~~~~~~~~~~
compilation terminated.
make[1]: *** [Makefile:11: H5Zzfp_lib.o] Error 1
make[1]: *** Waiting for unfinished jobs....
H5Zzfp.c:42:10: fatal error: bitstream.h: No such file or directory
   42 | #include "bitstream.h"
      |          ^~~~~~~~~~~~~
compilation terminated.
make[1]: *** [Makefile:7: H5Zzfp_plugin.o] Error 1
make[1]: Leaving directory '/tmp/alex/spack-stage/spack-stage-h5z-zfp-1.0.1-hceptxfyniryctra5xl6qvyt4bdly7a2/spack-src/src'
make: *** [Makefile:30: all] Error 2

Attempting to install without the fortran compiler with spack install h5z-zfp~fortran also fails.

Any advice on how to fix my install is appreciated.
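The root cause here appears to be that zfp 1.0.0 moved bitstream.h from include/ into include/zfp/, while H5Zzfp.c still does #include "bitstream.h". A possible stopgap, sketched below with placeholder paths, is to add that subdirectory to the include flags (whether the filter's makefile passes CFLAGS through to the compile this way is an assumption):

```shell
# Workaround sketch, assuming the usual zfp 1.0.0 install layout where
# bitstream.h lives under include/zfp/ rather than include/.
# ZFP_HOME below is a placeholder; point it at your actual zfp prefix.
ZFP_HOME=/path/to/zfp-1.0.0
EXTRA_INC="-I$ZFP_HOME/include/zfp"
# Then pass the extra include directory through to the compile, e.g.:
#   make ZFP_HOME="$ZFP_HOME" HDF5_HOME=/path/to/hdf5 CFLAGS="$EXTRA_INC" all
echo "$EXTRA_INC"
```

Alternatively, patching the #include in H5Zzfp.c to "zfp/bitstream.h" (as newer H5Z-ZFP branches do) avoids needing extra flags at all.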

Using H5Z-ZFP static library

Hi, I'm trying to link the H5Z-ZFP static library into SW4. I'm able to compile ZFP (v1.0.0) and H5Z-ZFP (v1.1.0).

I added the following to SW4's CMakeList:

CMAKE_POLICY(SET CMP0028 NEW)
...
OPTION(USE_ZFP "Use ZFP compression." OFF)
IF(USE_ZFP)
    SET(H5Z_ZFP_USE_STATIC_LIBS ON)
    FIND_PACKAGE(H5Z_ZFP 1.1.0 CONFIG)
    ADD_DEFINITIONS(-DUSE_ZFP)
ENDIF (USE_ZFP)
...
IF(USE_ZFP)
    TARGET_LINK_LIBRARIES(sw4 PRIVATE h5z_zfp::h5z_zfp)
ENDIF (USE_ZFP)

But when I run the cmake command in SW4:

export H5Z_ZFP_DIR=/global/cfs/cdirs/m3354/tang/H5Z-ZFP-1.1.0/build/
cmake -DUSE_HDF5=ON -DUSE_ZFP=ON ..

I got the following error message:

-- Could NOT find H5Z_ZFP: missing: H5Z_ZFP_LIBRARY H5Z_ZFP_INCLUDE_DIR (found /global/cfs/cdirs/m3354/tang/H5Z-ZFP-1.1.0/build/install/lib/cmake/h5z_zfp/h5z_zfp-config.cmake (found suitable version "1.1.0", minimum required is "1.1.0"))
CMake Warning at CMakeLists.txt:78 (FIND_PACKAGE):
  Found package configuration file:

    /global/cfs/cdirs/m3354/tang/H5Z-ZFP-1.1.0/build/install/lib/cmake/h5z_zfp/h5z_zfp-config.cmake

  but it set H5Z_ZFP_FOUND to FALSE so package "H5Z_ZFP" is considered to be
  NOT FOUND.

-- Configuring done
CMake Error at CMakeLists.txt:167 (ADD_EXECUTABLE):
  Target "sw4" links to target "h5z_zfp::h5z_zfp" but the target was not
  found.  Perhaps a find_package() call is missing for an IMPORTED target, or
  an ALIAS target is missing?

Any idea what is going on?
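One thing worth checking, based on the "missing: H5Z_ZFP_LIBRARY H5Z_ZFP_INCLUDE_DIR" message: the config file was found under build/install/, but CMake may not be searching that install prefix for the actual library and header. A sketch of what I'd try, with placeholder paths (not verified against this exact setup):

```shell
# Point CMake at the *install* prefix rather than the build directory,
# so the package config can resolve the library and header locations.
# H5Z_ZFP_PREFIX is a placeholder for your real install prefix.
H5Z_ZFP_PREFIX=/path/to/H5Z-ZFP/install
export H5Z_ZFP_DIR="$H5Z_ZFP_PREFIX/lib/cmake/h5z_zfp"
export CMAKE_PREFIX_PATH="$H5Z_ZFP_PREFIX${CMAKE_PREFIX_PATH:+:$CMAKE_PREFIX_PATH}"
# Then reconfigure, e.g.:
#   cmake -DUSE_HDF5=ON -DUSE_ZFP=ON ..
echo "$H5Z_ZFP_DIR"
```

In the command shown above, H5Z_ZFP_DIR pointed at build/ while the config file lives under build/install/lib/cmake/h5z_zfp, which may be why the find module located the config but not the library itself.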

Potential issue with generic properties not getting copied

In the property interface, we use temporary generic properties and do not define all of the HDF5 library callbacks a property list might need. This is lazy. It could cause problems for callers using more sophisticated sequences of interactions with the HDF5 library. It should be fixed.

Help with recent update

@brtnfld I am completely perplexed by something I've done here...

a10bd7b

I was doing a pull of master onto my clone. I wound up having to fix a couple of conflicts in docs/cd_vals.rst, which I was a bit confused by, but did. I then did a git push --dry-run --verbose to make sure it wasn't going to push any changes to master ...

git push --dry-run --verbose
Pushing to github.com:LLNL/H5Z-ZFP.git
To github.com:LLNL/H5Z-ZFP.git
   1660d94..a10bd7b  master -> master

So, I did the push...

git push 
Enumerating objects: 1, done.
Counting objects: 100% (1/1), done.
Writing objects: 100% (1/1), 250 bytes | 250.00 KiB/s, done.
Total 1 (delta 0), reused 0 (delta 0), pack-reused 0
remote: Bypassed rule violations for refs/heads/master:
remote: 
remote: - Changes must be made through a pull request.
remote: 
To github.com:LLNL/H5Z-ZFP.git
   1660d94..a10bd7b  master -> master

But, the above referenced commit seems to show I changed 13 files! 😕

add -lm for test_write_plugin

FYI, I found that I needed to add the math library to the link flags for the test_write_plugin (to get sin):

- $(CC) $< -o $@ $(PREPATH)$(HDF5_LIB) $(PREPATH)$(ZFP_LIB) -L$(HDF5_LIB) -L$(ZFP_LIB) -lhdf5 -lzfp $(LDFLAGS)
+ $(CC) $< -o $@ $(PREPATH)$(HDF5_LIB) $(PREPATH)$(ZFP_LIB) -L$(HDF5_LIB) -L$(ZFP_LIB) -lm -lhdf5 -lzfp $(LDFLAGS)

test-h5repack fails on big endian architectures

We recently reported test failures on s390x, see #95. While @spanezz initially reported two failing test cases, the whole bug discussion focused on only one of them. I also figured that these issues are technically independent, so I am now forking the issue.

The second issue (not discussed in #95) is that test-h5repack fails with the message size-ratio: it expects a size ratio >= 200 but gets 99. Together with @spanezz, I tried to get to the bottom of this, but our progress was limited. Thus we report what we know here:

  • This issue affects big-endian architectures in general (e.g. 32-bit PowerPC, ppc64, sparc64, ...).
  • The repacked mesh_repack.h5 file is slightly larger than the original, while it should be significantly smaller.
  • If you try to h5dump mesh_repack.h5 on a little-endian machine without the h5z-zfp plugin, it fails (expected). If you do the same on a big-endian machine, it succeeds. This indicates that the repacking step did not actually end up using h5z-zfp.
  • Changing the filter id (UD=...) on the big-endian machine to some nonsense does not change its behavior, while we would have expected it to error out.

Given these symptoms, do you already have a guess as to where the cause may lie, or how we could narrow it down?
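One small diagnostic that might help: h5dump -p prints each dataset's filter pipeline, which should confirm whether the repacked file actually carries the ZFP filter (id 32013 per H5Z-ZFP). Sketched below with the real command commented out, since it needs the file present:

```shell
# Diagnostic sketch: with -p (properties) and -H (header only), h5dump
# lists each dataset's FILTERS section, including user-defined filter ids.
# Commented out here because it needs mesh_repack.h5 to exist:
#   h5dump -p -H mesh_repack.h5 | grep -A 3 FILTERS
CHECK_CMD='h5dump -p -H mesh_repack.h5'
echo "$CHECK_CMD"
```

If the big-endian repacked file shows no user-defined filter in that section, that would confirm h5repack silently skipped applying h5z-zfp.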

Use with python

Sorry to ask a really basic question here, but I was wondering if it is possible to use this library with Python (presumably using h5py), or if the only way to use ZFP in Python is with https://pypi.org/project/pyzfp/
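Not an authoritative answer, but two routes are commonly used; the path below is a placeholder, and the note about hdf5plugin is my recollection of that package's own documentation rather than anything from this repo:

```shell
# 1) Let HDF5 auto-load this plugin from h5py: the variable must be set
#    in the environment BEFORE h5py opens any files, so the HDF5 library
#    can locate libh5zzfp.so.  Path is a placeholder.
export HDF5_PLUGIN_PATH=/path/to/H5Z-ZFP/plugin
# 2) Or use the hdf5plugin package, which (per its docs) bundles a ZFP
#    filter registered for h5py:
#      pip install h5py hdf5plugin
echo "$HDF5_PLUGIN_PATH"
```

With route 1, reading ZFP-compressed datasets through h5py should then work transparently, since the filter is applied inside the HDF5 library rather than in Python.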

Add parallel test to CI testing

If its easily possible to load an mpi implementation in the CI system, we should try to run a parallel test too to make sure it all still works fine in parallel.

Fix duplication of compile-time version info

Compile-time version number information is duplicated in H5Zzfp_plugin.h and H5Zzfp_props_f.F90.

@brtnfld is there any reason H5Zzfp_props_f.F90 shouldn't just include H5Zzfp_plugin.h? Maybe I should create a header file that has only that information in it and nothing else?

'include/make.vars.inc' is referring to the stage directory

The contents of the installed make.vars.inc:

ZFP_HOME=/usr/ports/science/h5z-zfp/work/stage/usr/local HDF5_HOME=/usr/ports/science/h5z-zfp/work/stage/usr/local PREFIX=/usr/local

Stage directory is /usr/ports/science/h5z-zfp/work/stage. No installed files can contain the stage directory.

Version 1.0.0.

Help using H5Z-ZFP with netCDF

All,

This might be something never tried before, but I'm hoping I can get help here. With @lindstro's help, I was able to compile both zfp 0.5.2 and H5Z-ZFP and I think I did so correctly. I then, echoing the example noted on this page:

https://www.unidata.ucar.edu/software/netcdf/docs/md__Users_wfisher_Desktop_v4_86_82_netcdf-c_docs_filters.html#NCCOPY

tried to do an nccopy using zfp, but:

(133) $ setenv HDF5_PLUGIN_PATH $SITEAM/Baselibs/ESMA-Baselibs-5.2.0-ZFPTry/x86_64-unknown-linux-gnu/ifort_18.0.3.222-openmpi_3.1.0-gcc_6.3.0/Linux/H5Z-ZFP/plugin
(134) $ $SITEAM/Baselibs/ESMA-Baselibs-5.2.0-ZFPTry/x86_64-unknown-linux-gnu/ifort_18.0.3.222-openmpi_3.1.0-gcc_6.3.0/Linux/bin/nccopy -F 'T,32013,6,3,0,3539053052,1062232653,0,0' stock-JU-2018Sep27-1day-c12.geosgcm_prog.20000415_0000z.nc4 test.nc4
HDF5-DIAG: Error detected in HDF5 (1.10.4) thread 0:
  #000: H5E.c line 607 in H5Eget_class_name(): not a error class ID
    major: Invalid arguments to routine
    minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.10.4) thread 0:
  #000: H5T.c line 1876 in H5Tget_class(): not a datatype
    major: Invalid arguments to routine
    minor: Inappropriate type
NetCDF: HDF error
Location: file nccopy.c; line 1886
HDF5-DIAG: Error detected in HDF5 (1.10.4) thread 0:
  #000: H5Z.c line 366 in H5Zunregister(): unable to unregister filter
    major: Data filters
    minor: Unable to initialize object
  #001: H5Z.c line 401 in H5Z__unregister(): filter is not registered
    major: Data filters
    minor: Object not found

So obviously I don't know what I'm doing.

I hope you don't mind helping me out here. It's entirely possible I didn't even get the builds correct.

switch to ZFP_CODEC for stored data

We currently store the ZFP library version with stored data.

This is so we can issue an error message if the ZFP lib (or H5Z-ZFP filter) as currently compiled is somehow incompatible with the ZFP lib (or H5Z-ZFP filter) used to store the data.

But we should really use ZFP_CODEC version for this instead.

Fail to install H5Z-ZFP in both my local computer and remote cluster

I used spack to install H5Z-ZFP on both machines, and both fail with the following problem:

==> Error: ProcessError: Command exited with status 2:
'make' '-j16' 'PREFIX=/home/sci/zhimin/spack/opt/spack/linux-centos7-haswell/gcc-4.8.5/h5z-zfp-1.0.1-7rl4zzh2isrkdlcwynkyz7bolpkqj73h' 'CC=/home/sci/zhimin/spack/opt/spack/linux-centos7-haswell/gcc-4.8.5/openmpi-4.1.5-js4vse4qmp7nmvtye7lkcuxfx73wm6ah/bin/mpicc' 'HDF5_HOME=/home/sci/zhimin/spack/opt/spack/linux-centos7-haswell/gcc-4.8.5/hdf5-1.14.1-2-x7ebmkcnokmiiewa72aqoct3onzkrdhm' 'ZFP_HOME=/home/sci/zhimin/spack/opt/spack/linux-centos7-haswell/gcc-4.8.5/zfp-1.0.0-boieacjuaukpfdtk23idw6pkzphtsfrc' 'FC=' 'all'

6 errors found in build log:

 4 grep: /H5Zzfp_plugin.h: No such file or directory
 5 cd src; make ZFP_HOME=/home/sci/zhimin/spack/opt/spack/linux-centos7-haswell/gcc-4.8.5/zfp-1.0.0-boieacjuaukpfdtk23idw6pkzphtsfrc PREFIX=/home/sci/zhimin/spack/opt/spack/linux-centos7-haswell/gcc-4.8.5/h5z-zfp-1.0.1-7rl4zzh2isrkdlcwynkyz7bolpkqj73h all
 6 make[1]: Entering directory `/tmp/zhimin/spack-stage/spack-stage-h5z-zfp-1.0.1-7rl4zzh2isrkdlcwynkyz7bolpkqj73h/spack-src/src'
 7 /home/sci/zhimin/spack/opt/spack/linux-centos7-haswell/gcc-4.8.5/openmpi-4.1.5-js4vse4qmp7nmvtye7lkcuxfx73wm6ah/bin/mpicc -c H5Zzfp.c -o H5Zzfp_lib.o -DH5Z_ZFP_AS_LIB -fPIC -I. -I/home/sci/zhimin/spack/opt/spack/linux-centos7-haswell/gcc-4.8.5/zfp-1.0.0-boieacjuaukpfdtk23idw6pkzphtsfrc/include -I/home/sci/zhimin/spack/opt/spack/linux-centos7-haswell/gcc-4.8.5/hdf5-1.14.1-2-x7ebmkcnokmiiewa72aqoct3onzkrdhm
 8 /home/sci/zhimin/spack/opt/spack/linux-centos7-haswell/gcc-4.8.5/openmpi-4.1.5-js4vse4qmp7nmvtye7lkcuxfx73wm6ah/bin/mpicc -c H5Zzfp_props.c -o H5Zzfp_props.o -fPIC -I. -I/home/sci/zhimin/spack/opt/spack/linux-centos7-haswell/gcc-4.8.5/zfp-1.0.0-boieacjuaukpfdtk23idw6pkzphtsfrc/include -I/home/sci/zhimin/spack/opt/spack/linux-centos7-haswell/gcc-4.8.5/hdf5-1.14.1-2-x7ebmkcnokmiiewa72aqoct3onzkrdhm
 9 /home/sci/zhimin/spack/opt/spack/linux-centos7-haswell/gcc-4.8.5/openmpi-4.1.5-js4vse4qmp7nmvtye7lkcuxfx73wm6ah/bin/mpicc -c H5Zzfp.c -o H5Zzfp_plugin.o -fPIC -I. -I/home/sci/zhimin/spack/opt/spack/linux-centos7-haswell/gcc-4.8.5/zfp-1.0.0-boieacjuaukpfdtk23idw6pkzphtsfrc/include -I/home/sci/zhimin/spack/opt/spack/linux-centos7-haswell/gcc-4.8.5/hdf5-1.14.1-2-x7ebmkcnokmiiewa72aqoct3onzkrdhm
10 H5Zzfp.c:42:23: fatal error: bitstream.h: No such file or directory
11  #include "bitstream.h"
12                        ^
13 H5Zzfp.c:42:23: fatal error: bitstream.h: No such file or directory
14  #include "bitstream.h"
15                        ^
16 compilation terminated.
17 compilation terminated.
18 ar cr libh5zzfp.a H5Zzfp_props.o
19 make[1]: *** [H5Zzfp_lib.o] Error 1
20 make[1]: *** Waiting for unfinished jobs....
21 make[1]: *** [H5Zzfp_plugin.o] Error 1
22 make[1]: Leaving directory `/tmp/zhimin/spack-stage/spack-stage-h5z-zfp-1.0.1-7rl4zzh2isrkdlcwynkyz7bolpkqj73h/spack-src/src'
23 make: *** [all] Error 2

See build log for details:
/tmp/zhimin/spack-stage/spack-stage-h5z-zfp-1.0.1-7rl4zzh2isrkdlcwynkyz7bolpkqj73h/spack-build-out.txt

h5repack does not work with ZFP

I just posted this to the HDF5 user group, as it touches both hdf5 and zfp.

https://forum.hdfgroup.org/t/h5repack-not-working-for-zfp-filter-id-32012/8702

Short description: When using h5repack with the user defined ZFP filter, it fails with "Error occurred while repacking"

The plugin has been determined to work with h5dump on zfp-compressed data, so I don't think it's the plugin itself but rather how it's being used by h5repack. There is an old patch for h5repack (https://github.com/LLNL/H5Z-ZFP/blob/master/test/h5repack_parse.patch) that doesn't look like it ever made its way into the mainline code; it dates from 2015 and can no longer be applied.

Support endian targeting

What is endian targeting? It is a way for a data producer to decide, at write time, to have the HDF5 library endian-swap the data before it gets stored to the file. A reason for doing this is to free consumers, who may be on a different-endian system, from paying the price of endian-swapping each time the data is read.

The HDF5 library is fully able to handle endian-swapping and often does so transparently to most consumers. In typical operation, the need for endian-swapping is detected only at read time, and so it is performed only during reads, when it is necessary. Endian targeting allows a data producer to pre-format the data for the most expected downstream read cases.

The interface for controlling this is the hid_t type_id argument in H5Dcreate, which can be used to influence the endianness in the file, and hid_t mem_type_id in H5Dwrite, which indicates the endianness of the data being passed from memory to HDF5. Endian targeting is then the condition that the endianness in the H5Dwrite call does not match the endianness in the H5Dcreate call.

For reasons described here we currently disallow endian targeting. But we should probably support it, for the same reasons described above: it would free readers from suffering the penalty on read.

But the more I think about this, I don't think it is even possible. We can't endian-swap the data before handing it to ZFP to compress, because we would then be giving ZFP funky data; a format it isn't expecting for the architecture it is currently executing on. Endian-swapping after ZFP compresses doesn't make sense either, because at that point we don't have data for which endianness has any relevance.

@qkoziol is endian targeting even possible when combined with filters that are sensitive to endianness?
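To make the "funky data" point concrete, here is a small illustration (plain shell, nothing filter-specific) of what an endian swap does to an IEEE-754 double:

```shell
# Illustrative only: the double 1.0 has IEEE-754 bit pattern
# 0x3FF0000000000000.  Stored little-endian vs big-endian, the same
# value has its bytes reversed, so a buffer that has been swapped for
# a target architecture decodes to garbage if reinterpreted natively;
# handing such a buffer to ZFP would compress the wrong values.
python3 -c 'import struct; print("LE:", struct.pack("<d", 1.0).hex(" "))'
python3 -c 'import struct; print("BE:", struct.pack(">d", 1.0).hex(" "))'
```

Since ZFP's compression exploits the numeric structure of the values, the swapped bytes would not merely be stored oddly; they would be treated as entirely different (and noisy) numbers.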

Please strip binaries after you install them

====> Running Q/A tests (stage-qa)
Warning: 'lib/libh5zzfp.so' is not stripped consider trying INSTALL_TARGET=install-strip or using ${STRIP_CMD}
Warning: 'plugin/libh5zzfp.so' is not stripped consider trying INSTALL_TARGET=install-strip or using ${STRIP_CMD}

Required filter 32013 is not registered

Hi,

I've been playing with H5Z-ZFP with parallel HDF5 when I encountered the error

"#16: H5Z.c line 1264 in H5Z_find(): required filter 32013 is not registered"

when I tried to set up the filter. Could you please help me with the issue or shall I raise this at HDF5 repo?

My configuration is as follows, just in case this is some incompatibility issue:

gcc/9.3.0
openmpi/4.0.4
HDF5/1.12
H5Z-ZFP/v1.0.1

Thanks,
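In case it helps others hitting the same error, two common remedies are sketched below; paths are placeholders, and H5Z_zfp_initialize() is the library-mode registration call declared in H5Zzfp_lib.h:

```shell
# 1) Plugin route: make sure the HDF5 library can find libh5zzfp.so at
#    runtime.  With MPI, this variable must be set in the environment
#    of every rank.  Path is a placeholder.
export HDF5_PLUGIN_PATH=/path/to/H5Z-ZFP/plugin
# 2) Library route: link libh5zzfp.a into the application and call
#    H5Z_zfp_initialize() before creating the dataset, so that filter
#    32013 is registered explicitly rather than loaded on demand.
echo "$HDF5_PLUGIN_PATH"
```

If the error persists with the plugin route, it is worth confirming that the plugin was built against the same HDF5 as the application, since a mismatched build can also fail to register.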
