
Comments (13)

jiahao commented on July 18, 2024

Yes, I think it would be the right thing to allow b to be a matrix, so that the Krylov methods we have will generalize automatically to block Krylov methods. The main thing I don't know is what termination criterion to write down. It's tempting to simply replace all the norms with matrix norms, but I'm not sure if that's the right thing to do.

from iterativesolvers.jl.
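To make the termination question concrete: one candidate block criterion is to require every column residual to be small relative to its own right-hand side, which is stricter than a single Frobenius-norm test on the whole residual block. A minimal sketch (block_converged is a hypothetical helper name, not part of IterativeSolvers.jl):

```julia
using LinearAlgebra

# Per-column stopping test for a block system A*X = B with residual R = B - A*X:
# every column must satisfy its own relative tolerance.
function block_converged(R::AbstractMatrix, B::AbstractMatrix; rtol = 1e-8)
    all(norm(view(R, :, i)) <= rtol * norm(view(B, :, i)) for i in axes(B, 2))
end

A = [4.0 1.0; 1.0 3.0]
B = [1.0 2.0; 0.5 1.0]
X = A \ B              # direct solve, so the residual is at machine-epsilon level
R = B - A * X

block_converged(R, B)            # per-column test passes
norm(R) <= 1e-8 * norm(B)        # the Frobenius-norm variant of the same test
```

The two tests agree here, but for ill-scaled right-hand sides the Frobenius version can declare convergence while an individual column is still inaccurate.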

ChrisRackauckas commented on July 18, 2024

It's tempting to simply replace all the norms with matrix norms, but I'm not sure if that's the right thing to do.

Note that if this is done, we may need to require packages like GenericSVD.jl in order to support number types like BigFloat since the SVD is used for the 2-norm and no generic fallback is given in Base.


rveltz commented on July 18, 2024

Hi,

Has anyone made progress on this issue? In my case, I need to solve a 2d PDE with a matrix-free method. I represent the solution as a matrix. The Frobenius norm would do for the stopping criterion.
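As a side note on that choice: in Julia ≥ 0.7, norm applied to a matrix already computes the Frobenius norm (the 2-norm of the flattened entries), so a matrix-shaped iterate needs no SVD and no special handling; the operator 2-norm is opnorm. A quick check:

```julia
using LinearAlgebra

# norm(::AbstractMatrix) is the Frobenius norm: identical to the vector
# 2-norm of the flattened entries, and defined for generic element types.
M = [3.0 0.0; 0.0 4.0]
norm(M)               # 5.0, the same as norm(vec(M))
norm(M) == norm(vec(M))
```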


haampie commented on July 18, 2024

Just to be sure, I think what you mean is support for custom vector types, right? Block methods are about solving a system AX=B for multiple right-hand sides, in the sense that you solve AX[:, i] = B[:, i] for all i's independently. This is not the same as representing the unknowns in a shape similar to your geometry (then you still have a single rhs).

In principle the iterative method is simply unaware of the geometry of your domain, so you could do the transformation from vector to your internal geometric representation of the unknowns before applying your matrix-free A, and then transform back to a vector afterwards. In fact, this should be a no-op, since the underlying data structure is the same: it's just reinterpreting a vector.
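The "no-op" claim can be checked directly: reshape on an Array shares the underlying data, so converting between the solver's flat vector and a 2d grid view costs no copy.

```julia
# The "solver" sees a plain vector; the PDE code sees a 4×4 grid.
v = collect(1.0:16.0)
M = reshape(v, 4, 4)   # shares memory with v; no data is copied

# Mutating through the matrix view is visible through the vector view.
M[1, 1] = 99.0
v[1]                   # 99.0: both names refer to the same storage
```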


rveltz commented on July 18, 2024

You are right, my 2d array is used like a vector. It is not meant for solving block systems AX[:, i] = B[:, i].

I thought about using reshape but worry about a performance loss. It is also not very convenient if the data is held on the GPU and no good reshape method is available.

For example, if I take A = x -> fft(x, 2), how can I solve Ax = y using your package?


andreasnoack commented on July 18, 2024

It's a general problem. See also JuliaLang/julia#25565. In Julia, we have generally associated vectors with 1d arrays and linear maps with 2d arrays but, as you mention, it doesn't have to be like that; it just makes things simpler. It looks like people would like a broader vector definition for dot/inner and norm, but next up is axpy!, and then the question is whether we should generally allow A*x as long as size(A, 2) == length(x), even when x is 2d or 3d.


haampie commented on July 18, 2024

I'm on Julia 0.7 right now, so I can't really test this approximately working Julia 0.6 code:

import Base: A_mul_B!

struct MyLinearOp
  n::Int
end

function A_mul_B!(y, A::MyLinearOp, x)
  fft!(reshape(copy!(y, x), A.n, A.n), 2) # copy x -> y and reshape to a matrix, then do fft; reshape shouldn't copy iirc
  y # return the original y, not the reshaped reference to it
end

function example(n = 128)
  A = MyLinearOp(n)
  x_square = rand(Complex128, n, n)
  y_square = similar(x_square)
  A_mul_B!(y_square, A, x_square) # should work

  x_linear = rand(Complex128, n * n)
  y_linear = similar(x_linear)
  A_mul_B!(y_linear, A, x_linear) # should work

  # then just call
  b = ... # some vector
  x = gmres(A, b)
  # and transform x back to your matrix-like representation.
end

But it might complain that you have to implement one or two things, such as eltype, size, etc. A better way to work with matrix-free methods is to use https://github.com/Jutho/LinearMaps.jl


rveltz commented on July 18, 2024

Thank you for your help!!

With LinearMaps, I don't think it is possible without the reshape, because the constructor requires the input size N to be an Int and not, for example, a tuple.

LinearMap{T=Float64}(f, [fc,], M::Int, N::Int = M; kwargs...)

Basically, my question is: can we solve Ax = y without reshaping, with y being a 2d array?


rveltz commented on July 18, 2024

Sorry, I think I got it now...


ChrisRackauckas commented on July 18, 2024

Basically, my question is: can we solve Ax = y without reshaping, with y being a 2d array?

Reshaping is just a view, so it's free.


dkarrasch commented on July 18, 2024

Indeed. There is a related example at LinearMaps.jl, which works just fine on Julia 0.7. I haven't done any performance testing, though. When using LinearMaps with the FunctionMap type, you get full compliance with IterativeSolvers automatically; there is no need to redefine LinearAlgebra methods, etc.


dkarrasch commented on July 18, 2024

Late to the party, but here is some code for testing:

using LinearAlgebra, LinearMaps, BenchmarkTools

# with the following tweak, one may define multiplication for arrays
function LinearAlgebra.mul!(y::AbstractMatrix, A::LinearMaps.FunctionMap, x::AbstractMatrix)
    (length(x) == A.N && length(y) == A.M) || throw(DimensionMismatch())
    LinearMaps.ismutating(A) ? A.f(y, x) : copyto!(y, A.f(x))
    return y
end

using DSP, FFTW

n = 1024;
A = LinearMap((y, x) -> (FFTW.fft!(reshape(copyto!(y, x), n, n), 2); y),
              (y, x) -> (FFTW.ifft!(reshape(copyto!(y, x), n, n), 2); y),
              n*n)
B = LinearMap((y, x) -> FFTW.fft!(copyto!(y, x), 2),
              (y, x) -> FFTW.ifft!(copyto!(y, x), 2),
              n*n)

x = rand(ComplexF64, n*n);
y = similar(x);
@benchmark mul!($y, $A, $x)
x = rand(ComplexF64, n, n);
y = similar(x);
@benchmark mul!($y, $B, $x)

This yields

julia> @benchmark mul!($y, $A, $x)
BenchmarkTools.Trial: 
  memory estimate:  2.83 KiB
  allocs estimate:  45
  --------------
  minimum time:     17.217 ms (0.00% GC)
  median time:      17.857 ms (0.00% GC)
  mean time:        18.095 ms (0.00% GC)
  maximum time:     24.932 ms (0.00% GC)
  --------------
  samples:          277
  evals/sample:     1

for the first benchmark, and

julia> @benchmark mul!($y, $B, $x)
BenchmarkTools.Trial: 
  memory estimate:  2.83 KiB
  allocs estimate:  45
  --------------
  minimum time:     17.178 ms (0.00% GC)
  median time:      18.130 ms (0.00% GC)
  mean time:        18.334 ms (0.00% GC)
  maximum time:     25.287 ms (0.00% GC)
  --------------
  samples:          273
  evals/sample:     1

for the second. Doesn't seem to be so bad.

For when mul!(Y, A, X) is meant to apply A columnwise to X::AbstractMatrix, there is a corresponding function mul! in LinearMaps.jl.
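Putting the pieces of this thread together, here is a minimal sketch of the whole round trip on Julia ≥ 0.7, assuming LinearMaps.jl and IterativeSolvers.jl are installed. The names apply_grid, F, and B_grid are made up for illustration, and the periodic stencil is just a stand-in for a real matrix-free PDE operator:

```julia
using LinearAlgebra, LinearMaps, IterativeSolvers

n = 8

# A linear operator acting directly on the n×n grid: a periodic
# "4u minus neighbors" stencil along dimension 1 (symmetric positive definite,
# eigenvalues in [2, 6], so GMRES converges quickly).
apply_grid(U) = 4 .* U .- circshift(U, (1, 0)) .- circshift(U, (-1, 0))

# Wrap the grid action as a mutating map on flat vectors.
# reshape and vec share memory with their argument, so no data is copied.
F = LinearMap{Float64}((y, x) -> copyto!(y, vec(apply_grid(reshape(x, n, n)))),
                       n^2; ismutating = true)

B_grid = rand(n, n)            # right-hand side in its natural grid shape
x = gmres(F, vec(B_grid))      # solve on the flat view
X_grid = reshape(x, n, n)      # reinterpret the solution as a grid again

norm(apply_grid(X_grid) - B_grid)  # residual, small relative to norm(B_grid)
```

The only true vector the solver ever sees is the flat view; the geometry lives entirely inside the wrapped function.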


rveltz commented on July 18, 2024

Thank you! Sorry for the late reply, but this does not seem to work:

sol = IterativeSolvers.gmres(B, x)

