
License: MIT License



FMISensitivity.jl


What is FMISensitivity.jl?

Unfortunately, FMUs (fmi-standard.org) are not differentiable by design. To unlock their full potential inside Julia, FMISensitivity.jl makes FMUs fully differentiable with respect to:

  • states and derivatives
  • inputs, outputs and other observable variables
  • parameters
  • event indicators
  • explicit time
  • state change sensitivity by event $\partial x^{+} / \partial x^{-}$ (if paired with FMIFlux.jl)

This opens the door to many applications, such as:

  • FMUs in Scientific Machine Learning, for example as part of Neural(O)DEs or PINNs with FMIFlux.jl
  • gradient-based optimization of FMUs (typically parameters) with FMI.jl (also dynamic optimization)
  • linearization, linear analysis and controller design
  • adding directional derivatives for existing FMUs with the power of Julia AD and FMIExport.jl [Tutorial is WIP]
  • ...
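As a small illustration of the linearization use case: once the system right-hand side is differentiable, the system matrix $A = \partial \dot{x} / \partial x$ falls out of plain Julia AD. In this sketch, a hand-written damped-pendulum function stands in for the FMU evaluation purely for illustration:

```julia
using ForwardDiff

# Stand-in for an FMU's right-hand side ẋ = f(x): a damped pendulum
# with state x = [angle, angular velocity] (illustrative, not an FMU call).
f(x) = [x[2], -9.81 * sin(x[1]) - 0.3 * x[2]]

# Linearize around the lower equilibrium x* = [0, 0]:
# A = ∂f/∂x evaluated at x*.
A = ForwardDiff.jacobian(f, [0.0, 0.0])
# A ≈ [ 0.0    1.0
#      -9.81  -0.3 ]
```

With a differentiable FMU, the same `ForwardDiff.jacobian` call applies to the FMU's state-derivative evaluation instead of the hand-written `f`.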

Supported AD frameworks are:

  • ForwardDiff
  • FiniteDiff
  • ReverseDiff
  • Zygote [WIP]
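The listed frameworks share a very similar gradient interface, which is what makes a differentiable FMU evaluation interchangeable between backends. A minimal sketch on a plain scalar function (standing in for an FMU output) shows the pattern:

```julia
using ForwardDiff, ReverseDiff, FiniteDiff

# Scalar-valued test function, standing in for a differentiable FMU output.
g(x) = sum(abs2, x) + x[1] * x[2]

x0 = [1.0, 2.0]

ForwardDiff.gradient(g, x0)                  # exact, forward mode
ReverseDiff.gradient(g, x0)                  # exact, reverse mode
FiniteDiff.finite_difference_gradient(g, x0) # numerical approximation
# all three ≈ [4.0, 5.0]
```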

Here, FMISensitivity.jl uses everything the FMI standard and Julia currently offer:

  • FMI built-in directional derivatives [DONE] and adjoint derivatives [WIP]
  • Finite differences (via FiniteDiff.jl) for FMUs that don't provide sensitivity information, as well as for special derivatives that are not part of the FMI standard (such as event indicators or explicit time)
  • coloring based on sparsity information shipped with the FMU [WIP]
  • coloring based on sparsity detection for FMUs without sparsity information [WIP]
  • implicit differentiation
  • ...
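For derivatives outside the FMI standard, such as the sensitivity of an output with respect to explicit time, the finite-difference fallback is the relevant path. A sketch with FiniteDiff.jl, using a plain time-dependent function as a stand-in for an FMU output:

```julia
using FiniteDiff

# y(t) standing in for a time-dependent FMU output; ∂y/∂t is not part of
# the FMI standard, so it is approximated numerically.
y(t) = sin(2.0 * t)

dydt = FiniteDiff.finite_difference_derivative(y, 0.5)
# dydt ≈ 2cos(1.0) ≈ 1.0806
```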

How can I use FMISensitivity.jl?

FMISensitivity.jl is part of FMIFlux.jl. If you only need FMU sensitivities, without anything else around them, and want to keep the dependencies as small as possible, FMISensitivity.jl might be the right way to go. You can install it via:

1. Open a Julia REPL, switch to package mode by typing ], and activate your preferred environment.

2. Install FMISensitivity.jl:

(@v1) pkg> add FMISensitivity

3. If you want to check that everything works correctly, you can run the tests bundled with FMISensitivity.jl:

(@v1) pkg> test FMISensitivity

4. Have a look inside the examples folder in the examples branch, or at the examples section of the documentation of the FMI.jl package. All examples are available as Julia scripts (.jl), Jupyter notebooks (.ipynb) and Markdown (.md).
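For scripted setups (e.g. in CI), steps 2 and 3 can also be performed via the Pkg API instead of the REPL's package mode:

```julia
using Pkg

Pkg.add("FMISensitivity")   # equivalent to: pkg> add FMISensitivity
Pkg.test("FMISensitivity")  # equivalent to: pkg> test FMISensitivity
```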

What FMI.jl-Library should I use?

To keep dependencies nice and clean, the original package FMI.jl has been split into new packages:

  • FMI.jl: High level loading, manipulating, saving or building entire FMUs from scratch
  • FMIImport.jl: Importing FMUs into Julia
  • FMIExport.jl: Exporting stand-alone FMUs from Julia Code
  • FMIBase.jl: Common concepts for import and export of FMUs
  • FMICore.jl: C-code wrapper for the FMI-standard
  • FMISensitivity.jl: Static and dynamic sensitivities over FMUs
  • FMIBuild.jl: Compiler/Compilation dependencies for FMIExport.jl
  • FMIFlux.jl: Machine Learning with FMUs
  • FMIZoo.jl: A collection of testing and example FMUs

What Platforms are supported?

FMISensitivity.jl is continuously tested under Julia 1.6 (LTS) and the latest Julia release, on the latest Windows and Ubuntu versions. Only x64 architectures are tested; Mac and x86 architectures might work, but are not tested.

How to cite?

Coming soon ...

Related publications:

Tobias Thummerer, Lars Mikelsons and Josef Kircher. 2021. NeuralFMU: towards structural integration of FMUs into neural networks. Martin Sjölund, Lena Buffoni, Adrian Pop and Lennart Ochel (Ed.). Proceedings of 14th Modelica Conference 2021, Linköping, Sweden, September 20-24, 2021. Linköping University Electronic Press, Linköping (Linköping Electronic Conference Proceedings; 181), 297-306. DOI: 10.3384/ecp21181297


Open Issues


AD Framework specific dispatch

Different AD frameworks have different requirements.
The common ChainRules interface that is currently used is not optimal in terms of allocations: some allocations are only necessary because of type restrictions needed for compliance with all AD frameworks, even if just one specific framework actually requires them.

  • Implement a dedicated ReverseDiff dispatch for eval/rrule
  • Add an in-place eval! function for FMU2Output
  • Improve the common (ChainRules) eval/frule/rrule functions, so they don't need to cover the ReverseDiff case (see e.g. this issue)
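For context, the generic path mentioned above follows the ChainRules convention of pairing a primal evaluation with a pullback. A heavily simplified, hypothetical sketch (the function name and body are placeholders, not FMISensitivity.jl's actual implementation) shows the shape of such an rrule; a dedicated ReverseDiff dispatch would bypass this generic path and thereby avoid the type-driven allocations:

```julia
using ChainRulesCore

# Hypothetical stand-in for an FMU evaluation: y = eval_fmu(x).
eval_fmu(x) = 2.0 .* x

function ChainRulesCore.rrule(::typeof(eval_fmu), x)
    y = eval_fmu(x)
    # Pullback: maps the output cotangent ȳ to the input cotangent x̄.
    # In the generic ChainRules path, ȳ may arrive with a wider type than a
    # specific backend needs, which is what forces the extra allocations.
    eval_fmu_pullback(ȳ) = (NoTangent(), 2.0 .* ȳ)
    return y, eval_fmu_pullback
end
```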
