
HssMatrices.jl


HssMatrices is a Julia package for hierarchically semiseparable (HSS) matrices. These are a type of hierarchically structured matrices that arise, among other applications, in the numerical solution of PDEs. HssMatrices.jl is intended to help users experiment with HSS matrices and related algorithms. It implements compression routines and HSS arithmetic, as well as helpful routines for clustering and visualization. HSS matrices have structures similar to the one in the illustration below. (Illustration: rank plot produced by plotranks.)

Getting started

Let us generate a simple Kernel matrix and convert it into HSS format:

using LinearAlgebra
using HssMatrices

K(x,y) = (x-y) != 0 ? 1/(x-y) : 1.
A = [ K(x,y) for x=-1:0.001:1, y=-1:0.001:1]
hssA = hss(A)

This will automatically build a cluster tree and compress the matrix accordingly. hss() acts as a smart constructor, which builds the HSS matrix depending on the supplied matrix and parameters. We can pass these parameters explicitly, for instance:

hssA = hss(A, leafsize=64, atol=1e-6, rtol=1e-6)

It can be handy to set default values for those parameters once and forget about them later on. This can be achieved by doing:

HssMatrices.setopts!(leafsize=64)

Compression/Recompression

But what if you want to choose the compression algorithm yourself? Instead of calling hss, HssMatrices provides direct access to deterministic and randomized HSS compression routines, as well as a recompression routine. In order to call these, we first have to specify a clustering of the degrees of freedom. The function bisection_cluster forms such a cluster tree of the degrees of freedom. With AbstractTrees.jl, we can also print it using print_tree:

using AbstractTrees
cl = bisection_cluster(1:m, leafsize=32)
print_tree(cl)

Once we have created row- and column-clusters, we can move on to compress our matrix. Direct compression can be achieved by calling compress:

rcl = bisection_cluster(1:m)
ccl = bisection_cluster(1:n)
hssA = compress(A, rcl, ccl, atol=1e-9, rtol=1e-9)

The tolerances in HssMatrices are handled such that the algorithms stop once either the absolute approximation error drops below atol or the relative approximation error drops below rtol. In order to enforce only one of the two criteria, we can set the other criterion to 0.
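In effect, each approximation step checks a criterion of the following shape; the helper below is a hypothetical illustration, not the package's internal function:

```julia
# Sketch of the stopping criterion: accept the approximation once the
# residual norm is below atol, or below rtol relative to the matrix norm.
# `converged` is a made-up name used only for illustration.
converged(resnorm, Anorm; atol=0.0, rtol=0.0) = resnorm <= atol || resnorm <= rtol * Anorm
```

Setting rtol=0 disables the relative criterion so that only atol is enforced, and vice versa.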

Apart from the direct compression routine, HssMatrices also implements the randomized compression routine developed by Per-Gunnar Martinsson. We can call it via either randcompress or randcompress_adaptive. The latter starts with a rank estimate of kest=10 and increases the sampling rank until the estimated approximation error is below the respective tolerance.

hssB = randcompress(A, rcl, ccl, kest=10);
hssB = randcompress_adaptive(A, rcl, ccl);

It is often useful to call these compression routines without explicitly constructing the matrix. To this end, HssMatrices implements the LinearMap type, which is derived from the LinearOperator type defined in LowRankApprox.jl. This type contains two functions for multiplication with the matrix and its adjoint, respectively, as well as one function for accessing individual entries of the LinearMap.

Id(i,j) = Matrix{Float64}(i.*ones(length(j))' .== ones(length(i)).*j')
IdOp = LinearMap{Float64}(n, n, (y,_,x) -> x, (y,_,x) -> x, (i,j) -> Id(i,j))
hssI = randcompress(IdOp, ccl, ccl, 0)
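As a sanity check, the entry-access function above does reproduce the identity pattern on plain index vectors; the snippet below restates it so that it is self-contained:

```julia
# Entry access for the identity: 1.0 exactly where a row index
# equals a column index, 0.0 elsewhere.
Id(i, j) = Matrix{Float64}(i .* ones(length(j))' .== ones(length(i)) .* j')

Id(1:2, 1:3)   # a 2x3 slice of the identity pattern
```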

Finally, basic arithmetic on hierarchical matrices often requires frequent recompression in order to guarantee that the representations remain efficient. This is implemented in src/compression.jl via the recompress! routine.

hssA = recompress!(hssA, atol=1e-3, rtol=1e-3)

All compression is handled in the sense that individual HSS block rows and columns approximate the corresponding blocks of the original matrix A to within atol or rtol.
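To verify that a tolerance was met, one can compare against the dense matrix. The generic check below works on plain arrays; applying it as relerr(A, Matrix(hssA)) assumes that the conversion to a dense Matrix materializes the HSS approximation.

```julia
using LinearAlgebra

# Relative approximation error in the operator norm; a generic helper,
# not part of the HssMatrices API.
relerr(A, B) = opnorm(A - B) / opnorm(A)
```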

It can also be useful to construct HSS matrices from specific data structures. For instance, we can construct an HSS matrix from a low-rank factorization in the following fashion:

U = randn(m, k); V = randn(n,k)
rcl = bisection_cluster(1:m, leafsize=lsz)
ccl = bisection_cluster(1:n, leafsize=lsz)
hssA = lowrank2hss(U, V, rcl, ccl)

Efficient HSS multiplication and division (inversion)

Of course we can now perform some arithmetic using HSS matrices. HssMatrices implements fast multiplication with dense and HSS matrices as well as fast solution of linear systems via the ULV factorization.

hssA*x
hssA\x

These operations will automatically dispatch to the right routines for fast multiplication and fast matrix division. Moreover, we can also use HSS arithmetic to multiply HSS matrices with each other and use left/right division on them:

hssA+hssB
hssA-hssB
hssA*hssB
hssA\hssB
hssA/hssB

Do not forget to call recompression in order to keep the ranks low!

recompress!(hssA)

Convenience routines

We can also have a look at the generators and extract them via

U1, V2 = generators(hssA, (1,2))

Another important piece of information is the maximum off-diagonal rank. We can compute it using

hssrank(hssA)

Alternatively, we can visualize the clustering and the off-diagonal ranks by calling

plotranks(hssA)

This should generate an image similar to the one seen at the top of the page.

Contribute

If you would like to contribute, start by looking at the issues and making a pull request. Writing unit tests and increasing code coverage is always a welcome contribution as well.

Acknowledgements

This library was inspired by the amazing hm-toolbox package by Stefano Massei, Leonardo Robol and Daniel Kressner. If you are using Matlab, I highly recommend trying it.

On numerous occasions, members of the Julia Slack channel have helped me with the challenges of writing my first Julia library. I would like to acknowledge their support.

hssmatrices.jl's People

Contributors

baptistelamic, bonevbs, github-actions[bot]


hssmatrices.jl's Issues

TagBot trigger issue

This issue is used to trigger TagBot; feel free to unsubscribe.

If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.

If you'd like for me to do this for you, comment TagBot fix on this issue.
I'll open a PR within a few hours, please be patient!

Positive Definiteness

Hello Boris,

Does approximating a kernel matrix in HSS form guarantee that positive definiteness is preserved? If yes, how is that achieved?

Thank you!

Complex Linear Algebra

Hello Boris,

First of all thank you for making this package available. HSS-matrices are something that I for a long time have wanted to play around with. This package finally makes that possible 😄

My main application is acoustical problems. Here we mostly deal with complex matrices. Do you believe it would be possible to use this package for complex linear algebra?

Even some simple linear algebra such as multiplication seems to initially cause some problems:

julia> n = 200
julia> A = rand(ComplexF64,n,n)
julia> hssA = hss(A)
julia> hssA*ones(n)
ERROR: MethodError: no method matching _matmatdown!(::Matrix{Float64}, ::HssMatrix{ComplexF64}, ::Matrix{Float64}, ::BinaryNode{Matrix{ComplexF64}}, ::Nothing, ::Float64, ::Float64)
Closest candidates are:
  _matmatdown!(::AbstractMatrix{T}, ::HssMatrix{T}, ::AbstractMatrix{T}, ::BinaryNode{AT}, ::Union{Nothing, AbstractMatrix{T}}, ::Number, ::Number) where {T, AT<:(AbstractArray{T, N} where N)} at /home/mpasc/.julia/packages/HssMatrices/1ZvgD/src/matmul.jl:44
  _matmatdown!(::HssMatrix{T}, ::HssMatrix{T}, ::BinaryNode{Matrix{T}}, ::Matrix{T}) where T at /home/mpasc/.julia/packages/HssMatrices/1ZvgD/src/matmul.jl:96
Stacktrace:
 [1] mul!(C::Matrix{Float64}, hssA::HssMatrix{ComplexF64}, B::Matrix{Float64}, α::Float64, β::Float64)
   @ HssMatrices ~/.julia/packages/HssMatrices/1ZvgD/src/matmul.jl:26
 [2] *
   @ ~/.julia/packages/HssMatrices/1ZvgD/src/matmul.jl:13 [inlined]
 [3] *(hssA::HssMatrix{ComplexF64}, x::Vector{Float64})
   @ HssMatrices ~/.julia/packages/HssMatrices/1ZvgD/src/matmul.jl:15
 [4] top-level scope
   @ REPL[128]:1

From the error it is obvious that this can be solved by casting the vector elements to have the same type as the matrix elements:

julia> x = hssA*ones(eltype(hssA),n)

Unfortunately things go wrong when I try to solve the system

julia> hssA\x
ERROR: MethodError: no method matching lu!(::HssMatrix{ComplexF64}, ::Val{true}; check=true)
Closest candidates are:
  lu!(::StridedMatrix{T}, ::Union{Val{false}, Val{true}}; check) where T<:Union{Float32, Float64, ComplexF32, ComplexF64} at /builddir/build/BUILD/julia-1.6.1/build/usr/share/julia/stdlib/v1.6/LinearAlgebra/src/lu.jl:79
  lu!(::Union{Hermitian{T, S}, Symmetric{T, S}} where {T, S}, ::Union{Val{false}, Val{true}}; check) at /builddir/build/BUILD/julia-1.6.1/build/usr/share/julia/stdlib/v1.6/LinearAlgebra/src/lu.jl:88
  lu!(::StridedMatrix{T} where T, ::Union{Val{false}, Val{true}}; check) at /builddir/build/BUILD/julia-1.6.1/build/usr/share/julia/stdlib/v1.6/LinearAlgebra/src/lu.jl:130
  ...
Stacktrace:
 [1] lu(A::HssMatrix{ComplexF64}, pivot::Val{true}; check::Bool)
   @ LinearAlgebra /builddir/build/BUILD/julia-1.6.1/build/usr/share/julia/stdlib/v1.6/LinearAlgebra/src/lu.jl:273
 [2] lu(A::HssMatrix{ComplexF64}, pivot::Val{true}) (repeats 2 times)
   @ LinearAlgebra /builddir/build/BUILD/julia-1.6.1/build/usr/share/julia/stdlib/v1.6/LinearAlgebra/src/lu.jl:272
 [3] \(A::HssMatrix{ComplexF64}, B::Vector{ComplexF64})
   @ LinearAlgebra /builddir/build/BUILD/julia-1.6.1/build/usr/share/julia/stdlib/v1.6/LinearAlgebra/src/generic.jl:1136
 [4] top-level scope
   @ REPL[131]:1

I have not been able to find an easy work-around (not that I have spent any reasonable time on it). Is this something you believe could be easily fixed?

Cheers,
Mikkel
