berkeleylab / inference-engine
A deep learning library for use in high-performance computing applications in modern Fortran
License: Other
We should add a short description to the repo. Something along the lines of:
"A Fortran 2018 <package/module> which enables inference using trained dense neural networks."
See DAG.
Here's the error I get trying to compile with ifx:
layer_s.f90 failed.
[ 57%] Compiling...
././src/inference_engine/layer_s.f90(22): error #6197: An assignment of different structure types is invalid. [CONSTRUCT]
layer%neuron = neuron_t(layer_lines, start+1)
-------------------^
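Presumably layer%neuron is declared with type neuron_t (the declaration is not shown in the log), in which case the flagged assignment looks valid and this may be an ifx bug. For reference, a minimal sketch of the genuinely invalid pattern that error #6197 is meant to flag (type names here are illustrative, not the repo's):

```fortran
! Sketch only: intentionally invalid code. Two distinct derived types
! cannot be intrinsically assigned to one another, which is what
! ifx error #6197 normally reports.
program distinct_types
  implicit none
  type :: a_t
    integer :: i
  end type
  type :: b_t
    integer :: i
  end type
  type(a_t) :: a
  a = b_t(1)   ! invalid: b_t structure constructor assigned to an a_t variable
end program
```

This sketch does not compile, by design; the question for the issue is why ifx treats the (apparently type-correct) assignment in layer_s.f90 the same way.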
There is a workaround for a nagfor bug in the src dir that can be removed because the latest version of nagfor should have fixed this. Removing the workaround should also allow renaming the affected file from .F90 to .f90.
Check do concurrent reduce locality-specifier support on the different compilers that we use, or hope to use, to compile inference-engine, as we may want to add said locality specifier to the code.
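For reference, a minimal sketch of the reduce locality specifier (standardized in Fortran 2023) that could be used to probe each compiler's support; compilers without it should reject the reduce clause at compile time:

```fortran
! Sketch: a do concurrent sum reduction using the Fortran 2023
! reduce locality specifier; s is declared as a (+) reduction variable.
program reduce_demo
  implicit none
  integer :: i
  real :: s, x(100)
  x = 1.0
  s = 0.0
  do concurrent (i = 1:size(x)) reduce(+:s)
    s = s + x(i)
  end do
  print *, s   ! sums to 100.0
end program
```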
The learn-microphysics-procedures.f90 example demonstrates how to train a neural network to model functions in the mp_thompson.f90 module. A possible next step could be modeling the rest of the procedures in the same module. The full set of procedures is below.
REAL FUNCTION GAMMLN(XX)
REAL FUNCTION GAMMP(A,X)
REAL FUNCTION WGAMMA(y)
REAL FUNCTION RSLF(P,T)
REAL FUNCTION RSIF(P,T)
SUBROUTINE mp_gt_driver( &
  qv, qc, qr, qi, qs, qg, ni, nr, &
  th, pii, p, dz, dt_in, itimestep, &
  RAINNC, RAINNCV, &
  SNOWNC, SNOWNCV, &
  GRAUPELNC, GRAUPELNCV, SR, &
  ids,ide, jds,jde, kds,kde, & ! domain dims
  ims,ime, jms,jme, kms,kme, & ! memory dims
  its,ite, jts,jte, kts,kte )  ! tile dims
subroutine mp_thompson( &
  qv1d, qc1d, qi1d, qr1d, qs1d, qg1d, ni1d, &
  nr1d, t1d, p1d, dz1d, &
  pptrain, pptsnow, pptgraul, pptice, &
  kts, kte, dt, i, j )
subroutine qr_acr_qg (data file: qr_acr_qg_mpt.dat)
subroutine qr_acr_qs (data file: qr_acr_qs_mpt.dat)
subroutine freezeH2O (data file: freezeH2O_mpt.dat)
subroutine qi_aut_qs
subroutine table_Efrw
subroutine table_Efsw
subroutine table_dropEvap
SUBROUTINE GCF - arguments: GAMMCF, A, X, GLN
SUBROUTINE GSER - arguments: GAMSER, A, X, GLN

Update README based on new directory changes; see comment posted by @rouson in #108 (comment).
Add cloud-microphysics setup script to CI
See comment posted by @rouson in #108 (comment).
After a new version of ifx is released, remove bug workarounds where we can:
- the ifdef related to an associate-stmt in get_key_value in inference_engine_s.F90
- the ifdef related to an associate-stmt in from_file in training_configuration_s.f90
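As a purely hypothetical illustration (the actual guarded code in the files above may differ), the kind of ifdef-around-associate workaround being tracked looks like:

```fortran
! Hypothetical sketch only: guard an associate-stmt that an older ifx
! miscompiles behind a preprocessor macro, with an equivalent block
! construct as the fallback. Macro name is illustrative.
program associate_workaround_demo
  implicit none
  character(len=*), parameter :: line = "key : value"
#ifdef HAVE_WORKING_ASSOCIATE
  associate(v => trim(adjustl(line(index(line, ":") + 1 :))))
    print *, v   ! prints "value"
  end associate
#else
  block
    character(len=:), allocatable :: v
    v = trim(adjustl(line(index(line, ":") + 1 :)))
    print *, v   ! prints "value"
  end block
#endif
end program
```

Once the fixed ifx ships, the #else branch (and the .F90 preprocessing requirement it forces) could be dropped.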
Currently, the command
fpm test --compiler flang-new --flag "-mmlir -allow-assumed-rank"
yields the trailing output
[ 57%] inference_engine_m_.f90 done.
error: Semantic errors in ././src/inference_engine/inference_engine_m_.f90
./././src/inference_engine/inference_engine_m_.f90:72:32: error: Result of pure function may not have polymorphic ALLOCATABLE ultimate component '%activation_strategy_'
type(inference_engine_t) inference_engine
^^^^^^^^^^^^^^^^
./././src/inference_engine/inference_engine_m_.f90:30:50: Declaration of 'activation_strategy_'
class(activation_strategy_t), allocatable :: activation_strategy_ ! Strategy Pattern facilitates elemental activation
^^^^^^^^^^^^^^^^^^^^
./././src/inference_engine/inference_engine_m_.f90:104:24: error: Result of pure function may not have polymorphic ALLOCATABLE ultimate component '%activation_strategy_'
type(exchange_t) exchange
^^^^^^^^
./././src/inference_engine/inference_engine_m_.f90:52:50: Declaration of 'activation_strategy_'
class(activation_strategy_t), allocatable :: activation_strategy_ ! Strategy Pattern facilitates elemental activation
^^^^^^^^^^^^^^^^^^^^
<ERROR> Compilation failed for object " src_inference_engine_inference_engine_m_.f90.o "
<ERROR> stopping due to failed compilation
STOP 1
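The diagnostic can be reproduced in a few lines; a minimal sketch of the pattern flang-new rejects (type and component names here are stand-ins for the repo's, not its actual API):

```fortran
! Sketch: a pure function whose result type has a polymorphic
! ALLOCATABLE ultimate component, which flang-new diagnoses as an error.
module engine_m
  implicit none
  type :: strategy_t                         ! stand-in for activation_strategy_t
  end type
  type :: engine_t                           ! stand-in for inference_engine_t
    class(strategy_t), allocatable :: strategy_
  end type
contains
  pure function make_engine() result(engine)
    type(engine_t) engine                    ! flang-new reports the error here
  end function
end module
```

flang-new appears to be enforcing a Fortran 2018 constraint on pure function results here; other compilers we use are evidently more lenient, so the fix may require either dropping pure from the affected functions or restructuring the component.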
I suggest adding topics such as ann, neural-network, and feedforward-neural-network in the About section at https://github.com/BerkeleyLab/inference-engine
git checkout add-file-reader
fpm test --compiler nagfor --flag -fpp
...
NAG Fortran Compiler Release 7.1(Hanzomon) Build 7113
Questionable: ./src/inference_engine_s.f90, line 229: Variable C set but never referenced
Panic: ./src/inference_engine_s.f90: free_TBF_item: Invalid item?
Internal Error -- please report this bug
Abort
<ERROR> Compilation failed for object " src_inference_engine_s.f90.o "
<ERROR>stopping due to failed compilation
STOP 1
./setup.sh runs under both the inference-engine and inference-engine/cloud-microphysics directories. However,

./build/run-fpm.sh run train-cloud-microphysics -- --base training --epochs 10 --start 720

does not run in inference-engine/cloud-microphysics.
PR #7 contains a skeletal demonstration of concurrent inference using multiple networks and a space_delimited_strings_to_array() internal function. The new test could evaluate the XOR truth table concurrently, with each do concurrent iteration using an independent copy of the XOR neural network for its evaluation. This is potentially faster than the sequential evaluation of the truth table in the existing tests.
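The proposed test can be sketched as follows; the hard-coded step-function network below is a self-contained stand-in for the repo's inference_engine_t, used only so the sketch runs on its own:

```fortran
! Sketch of concurrent XOR evaluation: each do concurrent iteration
! evaluates one row of the truth table via a pure function, so the
! iterations are independent and may execute in any order or in parallel.
program concurrent_xor_test
  implicit none
  integer :: i
  real :: inputs(2,4), outputs(4)
  real, parameter :: expected(4) = [0., 1., 1., 0.]
  inputs = reshape([0.,0., 0.,1., 1.,0., 1.,1.], [2,4])
  do concurrent (i = 1:4)
    outputs(i) = xor_net(inputs(:,i))   ! independent evaluation per iteration
  end do
  if (all(abs(outputs - expected) < 1.e-6)) print *, "XOR truth table verified"
contains
  pure real function xor_net(x)
    ! Fixed 2-2-1 step-activation network that computes XOR exactly.
    real, intent(in) :: x(2)
    real :: h(2)
    h = merge(1., 0., [sum(x) - 0.5, sum(x) - 1.5] > 0.)
    xor_net = merge(1., 0., h(1) - h(2) - 0.5 > 0.)
  end function
end program
```

In the real test each iteration would instead carry its own copy of a trained network object, but the independence structure is the same.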
Each of the three-line do concurrent/dot_product blocks in the infer type-bound procedure can be collapsed down to a one-line invocation of matmul. By default, it seems likely that most compilers would generate faster code with matmul, but it's best to be able to compare the two approaches with multiple compilers on multiple platforms to determine whether or not matmul is always superior. Scenarios to consider:

1. The default matmul implementation.
2. matmul with various optimization flags (-O...) set.
3. do concurrent with various optimization flags (-O...) set.
4. do concurrent offloaded to a GPU.
5. matmul offloaded to a GPU.

Option 4 and possibly option 5 are available with the Intel ifx compiler as of the 2022.3 version of oneAPI released two weeks ago. Option 4 and possibly option 5 have also been available with the NVIDIA nvfortran compiler for about two years, but nvfortran has limited support for Fortran 2008 and extremely limited support for Fortran 2018. I believe our only Fortran 2008 features are do concurrent (which nvfortran supports), module function/module subroutine interface bodies, and submodule, which nvfortran might or might not support. Working around the latter two features would require a lot of code revision, but would not be too painful.
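The collapse described above can be sketched as follows (array names a, x, b, z are illustrative, not the repo's; the two formulations compute the same layer output):

```fortran
! Sketch: the three-line do concurrent/dot_product pattern versus the
! equivalent one-line matmul, verified to agree on random data.
program matmul_vs_do_concurrent
  implicit none
  integer :: i
  real :: a(3,2), x(2), b(3), z_loop(3), z_mm(3)
  call random_number(a); call random_number(x); call random_number(b)
  do concurrent (i = 1:size(b))              ! current three-line pattern
    z_loop(i) = dot_product(a(i,:), x) + b(i)
  end do
  z_mm = matmul(a, x) + b                    ! proposed one-line replacement
  print *, "max difference:", maxval(abs(z_loop - z_mm))
end program
```

Timing each variant under the scenarios listed above would settle whether matmul wins everywhere or only on some compiler/platform combinations.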
Let's develop an alternative implementation of infer that does this, and enable switching between the two with a C preprocessor macro, something like:
#ifdef DO_CONCURRENT_INFER
module procedure infer
! (concurrent infer implementation)
end procedure
#else
module procedure infer
! (matmul implementation)
end procedure
#endif