Comments (39)

stevengj commented on August 17, 2024

Similarly for types, like your Krylov subspace type.

from iterativesolvers.jl.

ViralBShah commented on August 17, 2024

Yes, these should be untyped so that they can allow anything with * through duck typing. It should also allow for passing in a closure f(x) that evaluates A*x in its own way.

stevengj commented on August 17, 2024

It's not clear to me that we need to support passing a function x -> A*x, since the user can always define a new type and implement * for that type however they like. And the code will be cleaner if we write exclusively in terms of A*x.

stevengj commented on August 17, 2024

On the other hand, closures are cleaner for implementing things like shifts, since you can just do x -> A*x - µ*x without modifying A or going to the trouble of defining a new type.

One possibility is to write everything in terms of closures as the "low-level" interface, but have a higher-level interface with duck-typed matrices, of the form:

eigs(f::Function, startingvector::AbstractVector, .....) = ....
eigs(A, startingvector=rand(eltype(A), size(A,2)), ...) = eigs(x -> A*x, startingvector, ...)

stevengj commented on August 17, 2024

There is also the question of memory allocation: it might be nice to support functions of the form (x,y) -> y[:] = A*x in order to avoid memory allocation of result vectors.
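A minimal sketch of this allocation-free convention, assuming the modern `LinearAlgebra.mul!`: note that `y[:] = A*x` still allocates the intermediate product `A*x`, whereas `mul!(y, A, x)` writes into `y` directly. The names `f` and `f!` are illustrative, not part of any package API.

```julia
using LinearAlgebra

n = 4
A = rand(n, n)

f  = x -> A * x               # allocating: fresh result vector on every call
f! = (y, x) -> mul!(y, A, x)  # in-place: writes A*x into the preallocated y

x = rand(n)
y = similar(x)
f!(y, x)                      # y now holds A*x, with no temporary allocated
```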

ViralBShah commented on August 17, 2024

That is what I ended up settling on inside the current eigs implementation - writing everything in terms of closures and having higher level interfaces with AbstractMatrix.

stevengj commented on August 17, 2024

Why is the higher-level interface based on AbstractMatrix rather than duck-typing?

ViralBShah commented on August 17, 2024

That is the higher-level interface. eigs takes any A, which may or may not be an AbstractMatrix, and passes it to the ARPACK wrappers as x->A*x.

stevengj commented on August 17, 2024

Viral, x->A*x is not duck typing. I have to repeat my question: why does the higher-level interface have a type declaration at all (as opposed to untyped arguments with duck typing)? You wouldn't lose any functionality: you could still pass AbstractMatrix, but you would gain the ability to pass other types supporting *. This wouldn't conflict with the low-level interface, since the low-level interface would explicitly take a Function argument.

ViralBShah commented on August 17, 2024

There are two things that I would love for the iterative solvers to support:

  1. Pass in an AbstractMatrix, and it just works with A*x due to duck typing.
  2. Pass in a linear operator as a function, which is called as linop(x).

One way to achieve this is to have:

# Implement the method for linear operators
itermethod(linop::Function, args...)

# Handle AbstractMatrix this way
itermethod(A::AbstractMatrix, args...) = itermethod(x -> A*x, args...)

# For a user type that defines `*` but is not an AbstractMatrix, the user
# creates the closure and passes it in as the linear operator; duck typing
# makes the matrix-vector product inside the closure work.
itermethod(x -> A*x, args...)

This is the approach I have currently taken in the eigs interface.

stevengj commented on August 17, 2024

Viral, why not just have

itermethod(A, args...) = itermethod(x -> A*x, args...)

This way, you support any type that defines *, not just AbstractMatrix (e.g. factorization objects). This is the point I've been making repeatedly, which you have been ignoring... I feel like I'm talking past you.

ViralBShah commented on August 17, 2024

It seems that I have been talking past you too, or I am not sure what I am missing. There are often cases where people just want to apply an operator by calling a function. Your suggestion requires the user to always define a new type with * overloaded, and takes away the ability to simply pass a function. That seems like more work than ought to be necessary.

Let me provide a concrete example. See the applyMoler example in the MATLAB documentation for pcg: http://www.mathworks.in/help/matlab/ref/pcg.html

It is nice to be able to just pass applyMoler from the example to the solver, rather than create MolerType and then overload *. My proposal above tries to accommodate this kind of usage.

stevengj commented on August 17, 2024

No, no, no, no. My suggestion is to define:

itermethod(linop::Function, args...)
itermethod(A, args...) = itermethod(x -> A*x, args...)

It is exactly the same as your proposal, except with the type declaration removed in the second method. It gives more functionality with less code.
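A runnable sketch of this two-method pattern, using naive power iteration as a stand-in for a real solver; `poweriter` is an invented name, not part of the package:

```julia
using LinearAlgebra

# Low-level interface: the operator is a closure.
function poweriter(linop::Function, x0::AbstractVector; iters::Int = 200)
    x = normalize(x0)
    for _ in 1:iters
        x = normalize(linop(x))
    end
    return x
end

# High-level interface: untyped A, so anything supporting * works
# (dense or sparse matrices, factorization objects, user types).
poweriter(A, x0; kwargs...) = poweriter(x -> A * x, x0; kwargs...)

A = [2.0 1.0; 1.0 3.0]
v = poweriter(A, [1.0, 0.0])            # duck-typed call
w = poweriter(x -> A * x, [1.0, 0.0])   # explicit closure, same result
```

Since the low-level method declares `linop::Function`, the untyped fallback never shadows it, so the two methods coexist without ambiguity.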

ViralBShah commented on August 17, 2024

My understanding was that you were proposing not having the first definition at all. This clarifies things, and it does give more functionality with less code.

timholy commented on August 17, 2024

For most of these you need two linear operators, one to compute A*x and another to compute A'*x.

stevengj commented on August 17, 2024

@timholy, what do you mean? The most popular iterative algorithms for non-symmetric problems (GMRES, BiCGSTAB, Arnoldi, Jacobi–Davidson) only require A*x. Which one did you have in mind?

timholy commented on August 17, 2024

I didn't realize that. I guess I usually work with overdetermined least-squares problems, and those algorithms do require both.

stevengj commented on August 17, 2024

Ah, right, I was thinking of square-matrix problems; Golub–Kahan-like algorithms such as LSMR for non-square problems require both A and A', and for those algorithms the Function interface would presumably take two Functions as arguments (though I don't really know much about the state of the art in iterative least-squares solvers).
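A sketch of what such a two-closure interface might look like for an overdetermined problem: the solver receives the actions of A and A' as separate functions (all names here are illustrative).

```julia
A = [1.0 0.0; 0.0 2.0; 1.0 1.0]   # 3×2, overdetermined

f    = x -> A * x     # forward map:  R^2 -> R^3
fadj = y -> A' * y    # adjoint map:  R^3 -> R^2

# One normal-equations product A'*(A*x) without ever forming A'A:
x = [1.0, 1.0]
fadj(f(x))            # == [3.0, 6.0]
```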

timholy commented on August 17, 2024

As far as I can tell from the literature, for most applications LSMR isn't obviously better than LSQR, but eventually we should have both (I already have LSQR).

Does this fact change thinking about the API? I don't think it does, since for users it just emphasizes the need to implement both, but it is worth checking.

stevengj commented on August 17, 2024

Sigh. As explained in #7, @timholy, even if it is not appropriate for a given solver to support a Function interface, that is still not an argument for restricting the arguments to AbstractMatrix subtypes.

Why is duck-typing so hard to understand?

timholy commented on August 17, 2024

It's not hard to understand, I just disagree with you.

timholy commented on August 17, 2024

I guess out of fairness I should elaborate on the latter a bit more. In security/content-filtering, there are two approaches: whitelisting and blacklisting. Both have their places. Let's think about these in relation to Julia code. When somebody asks me to "turn on" access to a particular function for a particular type, it's pretty easy to "whitelist" via a Union or whatever. But in Julia it's really hard (or in some cases impossible) to blacklist. Hence I am more cautious about my types than you.

stevengj commented on August 17, 2024

@timholy, no one is arguing that duck typing should always be used in all contexts. The question is, should it be used in this context?

In this context, there are obvious benefits to duck-typing, because many objects can represent linear operators without being array-like container types (and in fact may not provide efficient random access to the matrix elements at all), and Julia already includes such objects (e.g. Factorization objects). By adding the type declaration, you lose access to these types unless you jump through hoops (defining a function or a MatrixFcn or some other wrapper).

In this context, I don't see any obvious drawbacks to duck-typing, nor have you or Viral made any specific arguments to that effect that I can see.

stevengj commented on August 17, 2024

Just to be clear, everyone now agrees that the iterative methods in this package should apply to arbitrary duck-typed arguments A supporting the requisite operations, rather than A::AbstractMatrix types?

jiahao commented on August 17, 2024

I will say that when I wrote the initial code, I thought that AbstractMatrix was anything that supported multiplication (and possibly backslash) on vectors, i.e. it was a black box that generated vectors. Now I'm just really confused as to what AbstractMatrix really is. Grepping the current base code turns up only particular matrix factorizations and matrices as subtypes of AbstractMatrix.

stevengj commented on August 17, 2024

Factorization types are not AbstractMatrix types:

julia> isa(Factorization, AbstractMatrix)
false

julia> isa(SymTridiagonal, AbstractMatrix)
false

Correction: Factorization <: AbstractMatrix is false, but SymTridiagonal <: AbstractMatrix is true.

An AbstractMatrix is simply a synonym for an AbstractArray of dimension 2. I think the principle is that an AbstractArray is an array-like container type with random access to its elements. Because of that, it is wrong to interpret the AbstractMatrix type as a superclass for all finite-dimensional linear operators, because many linear operators are represented implicitly in such a way that there is no direct access to their elements (short of multiplying by a unit vector and taking a dot product with another unit vector), e.g. factorization objects.

Furthermore, remember that Julia does not support multiple inheritance, so it is not necessarily practical to require all types that represent linear operators to subclass AbstractMatrix.
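To illustrate, here is a hypothetical implicitly represented operator: it supports *, has no element access, and has no AbstractMatrix supertype (echoing the shift example earlier in the thread; `ShiftedOp` is an invented name).

```julia
# A - μI represented implicitly: no getindex, no AbstractMatrix subtype.
struct ShiftedOp{TA,Tmu}
    A::TA
    mu::Tmu
end

# The only method a duck-typed solver needs:
Base.:*(S::ShiftedOp, x::AbstractVector) = S.A * x - S.mu * x

A = [4.0 1.0; 1.0 4.0]
S = ShiftedOp(A, 2.0)
S * [1.0, 0.0]    # == [2.0, 1.0]
```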

stevengj commented on August 17, 2024

(If you define an AbstractMatrix type and don't overload getindex, for example, simply printing it will throw an exception because print_matrix expects access to the elements!)

jiahao commented on August 17, 2024

I didn't mean to say that Factorizations are subtypes of AbstractMatrix, but rather that certain things in factorization.jl like QRPackedQ are.

Neither did I disagree with isa(SymTridiagonal, AbstractMatrix)==false. What I said was

julia> SymTridiagonal <: AbstractMatrix
true

However, I will agree that AbstractMatrix, as a container with two indices that allows random access, isn't general enough to describe arbitrary finite-dimensional linear operators.

stevengj commented on August 17, 2024

Whoops, sorry, you're right that I should have used <: and not isa.

stevengj commented on August 17, 2024

So, does everyone now agree that the iterative methods in this package should apply to arbitrary duck-typed arguments A supporting the requisite operations, rather than A::AbstractMatrix types?

jiahao commented on August 17, 2024

I will agree that there is no single type that currently exists that covers the full abstraction of LinearOperator, i.e. of arbitrary mappings of finite-dimensional vector spaces onto themselves, and that Union(Function,AbstractMatrix) would be a very unsatisfying thing to do. So until such time when a satisfactory LinearOperator comes into existence, duck-typing seems like the best we can do.

ViralBShah commented on August 17, 2024

I am fully on board with duck typing as the solution. It is more general than the LinearOperator example and requires the least amount of effort from the user: they provide either a type with * defined or a function that acts as the linear operator, and they do not have to do anything else.

stevengj commented on August 17, 2024

@jiahao, even if LinearOperator comes into existence, we will still need to duck-type because Julia doesn't support multiple inheritance. (Commit cb7c7eb does not close this issue because many other routines in this package are still overtyped.)

jiahao commented on August 17, 2024

I tried to do this for as many routines as I could manage in the follow-up commit (87635c4). Perhaps you can check whether I missed any. I think the only remaining routines that need to be fixed are svdvals_gkl and the newly merged gmres.

stevengj commented on August 17, 2024

Sorry, I missed that commit!

Regarding svdvals_gkl, you should understand that duck-typing is simply the trivial change of deleting the ::AbstractMatrix{T} type declaration of the argument. It is entirely separate from the question of whether you want to provide some kind of Function interface (which is tricky because of the need for A').

stevengj commented on August 17, 2024

Because of krylov.jl line 28, it seems solvers like cg will still only work for AbstractMatrix types. Instead, you can use eltype(A) or (probably better) eltype(b) (or, better yet, a floating-point promotion thereof, since otherwise it will fail if you pass an integer matrix) to get the vector type of the Krylov space.
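The suggested fix might look like this sketch (variable names are illustrative): derive the Krylov vector's element type from the inputs, promoted to floating point so that integer inputs don't break the solver.

```julia
A = [2 1; 1 2]    # integer matrix
b = [1, 0]        # integer right-hand side

# float() turns Int into Float64 but leaves Float32 etc. alone
T = float(promote_type(eltype(A), eltype(b)))
v = zeros(T, length(b))    # a Krylov basis vector of the right type
```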

Jutho commented on August 17, 2024

Dear all,

It is strange that I missed the start of this package. I am very interested in this and potentially willing to contribute. I am not that experienced, although I did (carefully) read large parts of Saad and of these lecture notes (http://people.inf.ethz.ch/arbenz/ewp/Lnotes/lsevp.pdf) recently.

Regarding the typing issue, I was involved in the AbstractMatrix-versus-duck-typing discussion a while ago and was in the end convinced that duck typing is the way to go. I do have one suggestion that might be relevant if the plan is also to provide high-level user functions (e.g. similar to eigs) that try to select the best low-level solver based on the element type, the symmetry/hermiticity/positive-definiteness, and other properties of the linear operator. It may be OK to ask a user to overload *(A::UserType, v::Vector), but not to also overload all these other methods. Even for the built-in AbstractMatrix types, issym etc. do not always produce the results a user might expect (they check element-wise for exact equality, without taking floating-point precision into account).

However, having to specify all these properties via arguments for every solver (which is what MATLAB does for eigs) also doesn't sound like a good programming strategy. While I was philosophizing about a Krylov-type package, I figured that it might be a good idea to have a wrapper type LinearOperator that accepts a general A (duck typed) and a list of arguments via which the user can specify the properties of the linear operator. This way, the actual solver methods would just call eltype, issym, isherm, isposdef, size, A*v, etc., and the code would look very readable and intuitive. These high-level solver methods should also be duck typed and can then be used with any of the following:

  • any subtype of AbstractMatrix
  • any user type that overloads the aforementioned methods
  • any user type / function that is wrapped in a LinearOperator and where the user explicitly provides the properties via arguments.
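A minimal sketch of the wrapper Jutho describes, with invented names (this predates, and is not, the later LinearMaps.jl API): the user wraps any matrix-like object or function and declares its properties explicitly, and solvers query the wrapper instead of relying on issym-style checks.

```julia
struct LinearOperator{F}
    apply::F              # x -> A*x, however the user implements it
    sz::Tuple{Int,Int}
    issym::Bool
    isposdef::Bool
end

# Wrap a duck-typed matrix-like A; properties declared by the user:
LinearOperator(A::AbstractMatrix; issym = false, isposdef = false) =
    LinearOperator(x -> A * x, size(A), issym, isposdef)

Base.:*(L::LinearOperator, x::AbstractVector) = L.apply(x)
Base.size(L::LinearOperator) = L.sz

# Wrapping a plain function as a symmetric positive-definite operator:
applyop(x) = 2 .* x
L = LinearOperator(applyop, (3, 3), true, true)
L * [1.0, 2.0, 3.0]    # == [2.0, 4.0, 6.0]
```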

haampie commented on August 17, 2024

This issue is also resolved by compatibility with LinearMaps.

Jutho commented on August 17, 2024

:-). I didn't even remember this one. I guess this was before (and why) I started LinearMaps.jl
