
Diffractor.jl's Introduction

Diffractor - Next Generation AD

General Overview

Diffractor is an experimental next-generation, compiler-based AD system for Julia.

Design goals:

  • Ultra high performance for both scalar and array code
  • Efficient higher order derivatives through nested AD
  • Reasonable compile times
  • High flexibility (like Zygote)
  • Support for forward/reverse/mixed modes
  • Fast Jacobians

This is achieved through a combination of innovations:

  • A new lowest-level interface (∂⃖, the "AD optic functor" or "diffractor"), more suited to higher-order AD; see the sketch after this list
  • New capabilities in Base Julia (Opaque closures, inference plugins)
  • Better integration with ChainRules.jl
  • Demand-driven forward-mode AD (Applying transforms to only those IR statements that contribute to relevant outputs of the function being differentiated)
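
For orientation, here is a minimal sketch of that lowest-level interface, inferred from how it is invoked in the issue reports further down this page. The exact calling convention is an assumption, and (per the status below) reverse mode is currently stripped out of releases, so this may only run on older versions:

using Diffractor

# Assumed convention, matching the stack traces below: ∂⃖{1}() applied to
# (f, args...) returns the primal value together with a pullback, and the
# pullback maps a cotangent for the output to cotangents for (f, x).
y, pb = Diffractor.∂⃖{1}()(sin, 1.0)
∂f, ∂x = pb(1.0)   # ∂x should be ≈ cos(1.0)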

Current Status

Diffractor is currently supported on Julia v1.10+. While the best performance is generally achieved by running on Julia nightly due to constant compiler improvements, the current release of Diffractor is guaranteed to work on Julia v1.10.

Current Status: Forward-Mode

Currently, forward mode is the only fully functional mode and is now shipping in some closed-source products. It is in a position to compete with ForwardDiff.jl and TaylorDiff.jl. It is not as battle-tested as ForwardDiff.jl, but it has several advantages. Primarily, since it is not an operator-overloading AD, it frees one from the need to relax type constraints and worry about the types of containers. Furthermore, like TaylorDiff.jl, it supports Taylor-series-based computation of higher-order derivatives. Finally, it directly and efficiently uses ChainRules.jl's frules, with no need for a wrapper macro to import them.

One limitation relative to ForwardDiff.jl is the lack of chunking support, i.e. the ability to push forward multiple basis vectors at once.
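
As a quick orientation, here is a hedged sketch of forward-mode usage based on the Diffractor.PrimeDerivativeFwd entry point that appears in the issue reports below; treat the exact API as subject to change:

using Diffractor

const var"'" = Diffractor.PrimeDerivativeFwd   # prime notation, as used in the issues below
f(x) = sin(x) * cos(x)
f'(1.0)   # forward-mode derivative; should be ≈ cos(2.0)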

Current Status: Reverse-Mode

Improved reverse mode support is planned for a future release. While reverse mode was originally implemented and working, it has been stripped out until such a time as it can be properly implemented on top of new Julia compiler changes.
⚠️ Reverse-mode support should be considered experimental; it may break without warning and may not be fixed rapidly. ⚠️

With that said, issues and PRs for reverse mode continue to be appreciated.

Status as of last time reverse mode was worked on:

The plan is to implement this in two stages:

  1. Generated-function-based transforms, using ChainRules, the new low-level interface, and opaque closures
  2. Adding inference plugins

Currently the implementation of Phase 1 is essentially complete, though mostly untested. Experimentation is welcome, though it is probably not yet ready to be a production AD system. The compiler parts of Phase 1 are a bit "quick and dirty", as the main point of Phase 1 is to prove that the overall scheme works. As a result, it has known suboptimalities. I do not intend to do much work on these, since they will be obsoleted by Phase 2 anyway.

A few features are still missing (e.g. chunking), and I intend to do some more work on user-friendly interfaces, but it should overall be usable as an AD system.
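
For reference, the user-facing reverse-mode entry points exercised in the issue reports below are Diffractor.gradient and Diffractor.PrimeDerivativeBack. A hedged sketch of how they were invoked (again: reverse mode may not work on current releases):

using Diffractor

Diffractor.gradient(x -> sum(abs2.(x)), [1.0, 2.0, 3.0])   # ([2.0, 4.0, 6.0],) per the reports below
D = Diffractor.PrimeDerivativeBack(x -> x^2)
D(3.0)   # should be ≈ 6.0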

Diffractor.jl's People

Contributors

aviatesk, azamatb, dependabot[bot], gdalle, giggleliu, github-actions[bot], jameswrigley, jw3126, keno, kristofferc, masonprotter, mattiasvillani, mcabbott, michel2323, niklasschmitz, nmheim, oscardssmith, oxinabox, pepijndevos, ranocha, rikhuijzer, samuela, simeonschaub, staticfloat, topolarity, viralbshah

Diffractor.jl's Issues

Cannot compute the second derivative of a generic function via forward mode AD

MWE:

julia> foo(x) = cos(x) * sin(x)
foo (generic function with 1 method)

julia> const var"'" = Diffractor.PrimeDerivativeFwd
Diffractor.PrimeDerivativeFwd

julia> @time foo''(1.0)
ERROR: MethodError: no method matching apply_type(::Type{Base.Broadcast.Broadcasted}, ::Type{Base.Broadcast.DefaultArrayStyle{0}}, ::Type{Nothing}, ::Type{typeof(*)}, ::Type{Tuple{Float64, Float64}})
Stacktrace:
  [1] ∂☆nomethd(args::Any)
    @ Diffractor ~/src/julia/Diffractor/src/stage1/recurse_fwd.jl:115
  [2] macro expansion
    @ ~/src/julia/Diffractor/src/stage1/recurse_fwd.jl:0 [inlined]
  [3] (::Diffractor.∂☆recurse{1})(::Diffractor.UniformBu, ::Diffractor.UniformBu, ::Diffractor.UniformBu, ::Diffractor.UniformBu, ::Diffractor.UniformBu, ::Diffractor.UniformBu)
    @ Diffractor ~/src/julia/Diffractor/src/stage1/recurse_fwd.jl:160

Diffractor forgets contents of structure?

I've been having an issue with Diffractor with TransformVariables.jl.
I have isolated the issue to this function: Diffractor seems to forget the contents of the structure. Any idea what might be going wrong?

using Diffractor
@inline UNPACK(x, ::Val{f}) where {f} = getproperty(x, f)
@inline UNPACK(x::AbstractDict{Symbol}, ::Val{k}) where {k} = x[k]
@inline UNPACK(x::AbstractDict{<:AbstractString}, ::Val{k}) where {k} = x[string(k)]

macro UNPACK(args)
    args.head!=:(=) && error("Expression needs to be of form `a, b = c`")
    items, suitecase = args.args
    items = isa(items, Symbol) ? [items] : items.args
    suitecase_instance = gensym()
    kd = [:( $key = $UNPACK($suitecase_instance, Val{$(Expr(:quote, key))}()) ) for key in items]
    kdblock = Expr(:block, kd...)
    expr = quote
        local $suitecase_instance = $suitecase # handles if suitecase is not a variable but an expression
        $kdblock
        $suitecase_instance # return RHS of `=` as standard in Julia
    end
    esc(expr)
end

function er(transforms)
    @UNPACK transformations = transforms
    transformations
end
struct tr
    transformations
end
er(tr(1)) #Works
Diffractor.∂⃖recurse{1}()(er,tr(1)) #Doesn't Work

Gradients with respect to dictionaries don't work

In Julia 1.7.3:

julia> Zygote.gradient(x -> x["foo"]["bar"]^2, Dict("foo" => Dict("bar" => 5)))
(Dict{Any, Any}("foo" => Dict{Any, Any}("bar" => 10)),)

julia> Diffractor.gradient(x -> x["foo"]["bar"]^2, Dict("foo" => Dict("bar" => 5)))
ERROR: ArgumentError: Tangent for the primal Dict{String, Int64} should be backed by a AbstractDict type, not by NamedTuple{(:vals,), Tuple{Vector{Int64}}}.
Stacktrace:
 [1] _backing_error(P::Type, G::Type, E::Type)
   @ ChainRulesCore C:\Users\anhin\.julia\packages\ChainRulesCore\ctmSK\src\tangent_types\tangent.jl:62
 [2] ChainRulesCore.Tangent{Dict{String, Int64}, NamedTuple{(:vals,), Tuple{Vector{Int64}}}}(backing::NamedTuple{(:vals,), Tuple{Vector{Int64}}})
   @ ChainRulesCore C:\Users\anhin\.julia\packages\ChainRulesCore\ctmSK\src\tangent_types\tangent.jl:33
 [3] (::Diffractor.var"#162#164"{Symbol, DataType})(Δ::Vector{Int64})
   @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:309
 [4] (::Diffractor.EvenOddOdd{1, 1, Diffractor.var"#162#164"{Symbol, DataType}, Diffractor.var"#163#165"{Symbol}})(Δ::Vector{Int64})
   @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:289
 [5] ∂⃖¹₁getproperty
   @ .\none:1

and

julia> pars = Dict(:x=>0f0, "y"=>4f0, 8=>-3f0)
Dict{Any, Float32} with 3 entries:
  "y" => 4.0
  8   => -3.0
  :x  => 0.0

julia> function my_map(my_f, my_dict)
                  new_dict = Dict()
                  for k in keys(my_dict)
                      new_dict[k] = my_f(my_dict[k])
                  end
                  new_dict
              end
my_map (generic function with 1 method)

julia> function my_sum(my_dict)
                  s = 0f0
                  for k in keys(my_dict)
                      s += my_dict[k]
                  end
                  s
              end
my_sum (generic function with 1 method)

julia> my_sum(my_map(x->x^2, pars))
25.0f0

julia> Zygote.gradient(pars -> my_sum(my_map(x->x^2, pars)), pars)
(Dict{Any, Any}("y" => 8.0f0, 8 => -6.0f0, :x => 0.0f0),)

julia> Diffractor.gradient(pars -> my_sum(my_map(x->x^2, pars)), pars)
ERROR: MethodError: no method matching (::Diffractor.∂⃖recurse{1})(::typeof(Core.sizeof), ::Int64)
Closest candidates are:
  (::Diffractor.∂⃖recurse)(::Any...) at C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:406
Stacktrace:
  [1] macro expansion
    @ C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:0 [inlined]
  [2] (::Diffractor.∂⃖recurse{1})(::typeof(Core.sizeof), ::Int64)
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:415
  [3] (::∂⃖{1})(f::typeof(Core.sizeof), args::Int64)
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:216
  [4] is_top_bit_set
    @ .\boot.jl:616 [inlined]
  [5] (::∂⃖{1})(f::typeof(Core.is_top_bit_set), args::Int64)
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:216
  [6] check_top_bit
    @ .\boot.jl:626 [inlined]
  [7] (::Diffractor.∂⃖recurse{1})(::typeof(Core.check_top_bit), ::Type{UInt64}, ::Int64)
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:0
  [8] (::∂⃖{1})(::typeof(Core.check_top_bit), ::Type, ::Vararg{Any})
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:216
  [9] toUInt64
    @ .\boot.jl:737 [inlined]
 [10] (::Diffractor.∂⃖recurse{1})(::typeof(Core.toUInt64), ::Int64)
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:0
 [11] (::∂⃖{1})(f::typeof(Core.toUInt64), args::Int64)
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:216
 [12] UInt64
    @ .\boot.jl:767 [inlined]
 [13] (::Diffractor.∂⃖recurse{1})(::Type{UInt64}, ::Int64)
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:0
 [14] (::∂⃖{1})(f::Type{UInt64}, args::Int64)
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:216
 [15] convert
    @ .\number.jl:7 [inlined]
 [16] (::Diffractor.∂⃖recurse{1})(::typeof(convert), ::Type{UInt64}, ::Int64)
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:0
 [17] (::∂⃖{1})(::typeof(convert), ::Type, ::Vararg{Any})
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:216
 [18] Dict
    @ .\dict.jl:90 [inlined]
 [19] (::Diffractor.∂⃖recurse{1})(args::Type{Dict{Any, Any}})
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:0
 [20] (::∂⃖{1})(::Type{Dict{Any, Any}})
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:216
 [21] Dict
    @ .\dict.jl:118 [inlined]
 [22] (::Diffractor.∂⃖recurse{1})(args::Type{Dict})
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:0
 [23] (::∂⃖{1})(::Type{Dict})
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:216
 [24] my_map
    @ .\REPL[20]:2 [inlined]
 [25] (::Diffractor.∂⃖recurse{1})(::typeof(my_map), ::var"#14#16", ::Dict{Any, Float32})
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:0
 [26] (::∂⃖{1})(::typeof(my_map), ::Function, ::Vararg{Any})
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:216
 [27] #13
    @ .\REPL[24]:1 [inlined]
 [28] (::Diffractor.∂⃖recurse{1})(::var"#13#15", ::Dict{Any, Float32})
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:0
 [29] (::∂⃖{1})(f::var"#13#15", args::Dict{Any, Float32})
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\stage1\generated.jl:216
 [30] ∂⃖(::Function, ::Vararg{Any})
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\interface.jl:25
 [31] (::Diffractor.∇{var"#13#15"})(args::Dict{Any, Float32})
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\interface.jl:123
 [32] Diffractor.∇(::Function, ::Dict{Any, Float32})
    @ Diffractor C:\Users\anhin\.julia\packages\Diffractor\WrKGJ\src\interface.jl:130
 [33] top-level scope
    @ REPL[24]:1

Precompile Fails v0.1.1

When trying to install Diffractor, I get the following error.
ERROR: LoadError: UndefVarError: add_inst! not defined
Stacktrace:
  [1] getproperty(x::Module, f::Symbol) @ Base ./Base.jl:31
  [2] top-level scope @ ~/.julia/packages/Diffractor/6BWpf/src/stage1/hacks.jl:6
  [3] include(mod::Module, _path::String) @ Base ./Base.jl:419
  [4] include(x::String) @ Diffractor ~/.julia/packages/Diffractor/6BWpf/src/Diffractor.jl:1
  [5] top-level scope @ ~/.julia/packages/Diffractor/6BWpf/src/stage1/recurse.jl:111
  [6] include(mod::Module, _path::String) @ Base ./Base.jl:419
  [7] include(x::String) @ Diffractor ~/.julia/packages/Diffractor/6BWpf/src/Diffractor.jl:1
  [8] top-level scope @ ~/.julia/packages/Diffractor/6BWpf/src/stage1/generated.jl:7
  [9] include(mod::Module, _path::String) @ Base ./Base.jl:419
 [10] include(x::String) @ Diffractor ~/.julia/packages/Diffractor/6BWpf/src/Diffractor.jl:1
 [11] top-level scope @ ~/.julia/packages/Diffractor/6BWpf/src/Diffractor.jl:15
 [12] include @ ./Base.jl:419 [inlined]
 [13] include_package_for_output(pkg::Base.PkgId, input::String, depot_path::Vector{String}, dl_load_path::Vector{String}, load_path::Vector{String}, concrete_deps::Vector{Pair{Base.PkgId, UInt64}}, source::Nothing) @ Base ./loading.jl:1554
 [14] top-level scope @ stdin:1
in expression starting at /home/aragorn/.julia/packages/Diffractor/6BWpf/src/stage1/hacks.jl:6
in expression starting at /home/aragorn/.julia/packages/Diffractor/6BWpf/src/stage1/recurse.jl:111
in expression starting at /home/aragorn/.julia/packages/Diffractor/6BWpf/src/stage1/generated.jl:7
in expression starting at /home/aragorn/.julia/packages/Diffractor/6BWpf/src/Diffractor.jl:1
in expression starting at stdin:1

Push Cthulhu out to a Weak Dep?

Depending on Cthulhu means depending on JuliaSyntax and a bunch of other stuff, since it is a user-facing tool.
We could make it a weak dep.

Some errors from broadcasting

julia> gradient(x -> sum(abs2.(x)), [1,2,3.])
([2.0, 4.0, 6.0],)

julia> gradient(x -> sum(x .^ 2), [1,2,3.])
ERROR: MethodError: no method matching (::Diffractor.∂⃖recurse{1})(::typeof(Core._apply_iterate), ::typeof(iterate), ::typeof(Core.apply_type), ::Tuple{DataType, DataType}, ::Core.SimpleVector)

julia> gradient(x -> sum(x .* x'), [1,2,3.])
([12.0, 12.0, 12.0],)

julia> gradient(x -> sum(x ./ x'), [1,2,3.])
ERROR: Rewrite reached intrinsic function bitcast. Missing rule?

The last gives a different error in forward mode, but I'm less sure that I'm doing it right.

Details
julia> using Diffractor: gradient, frule_via_ad, unthunk, DiffractorRuleConfig

julia> gradient(x -> sum(abs2.(x)), [1,2,3.])
([2.0, 4.0, 6.0],)

julia> gradient(x -> sum(x .^ 2), [1,2,3.])
ERROR: MethodError: no method matching (::Diffractor.∂⃖recurse{1})(::typeof(Core._apply_iterate), ::typeof(iterate), ::typeof(Core.apply_type), ::Tuple{DataType, DataType}, ::Core.SimpleVector)
Closest candidates are:
  (::Diffractor.∂⃖recurse)(::Any...) at /Users/me/.julia/dev/Diffractor/src/stage1/generated.jl:408
Stacktrace:
  [1] macro expansion
    @ ~/.julia/dev/Diffractor/src/stage1/generated.jl:0 [inlined]
  [2] (::Diffractor.∂⃖recurse{1})(::typeof(Core._apply_iterate), ::typeof(iterate), ::typeof(Core.apply_type), ::Tuple{DataType, DataType}, ::Core.SimpleVector)
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:417
  [3] (::∂⃖{1})(::typeof(Core._apply_iterate), ::Function, ::Vararg{Any})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:215
  [4] eltypes
    @ ./broadcast.jl:714 [inlined]
  [5] (::Diffractor.∂⃖recurse{1})(::typeof(Base.Broadcast.eltypes), ::Tuple{Base.RefValue{typeof(^)}, Vector{Float64}, Base.RefValue{Val{2}}})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:0
  [6] (::∂⃖{1})(f::typeof(Base.Broadcast.eltypes), args::Tuple{Base.RefValue{typeof(^)}, Vector{Float64}, Base.RefValue{Val{2}}})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:215
  [7] combine_eltypes
    @ ./broadcast.jl:717 [inlined]
  [8] (::Diffractor.∂⃖recurse{1})(::typeof(Base.Broadcast.combine_eltypes), ::typeof(Base.literal_pow), ::Tuple{Base.RefValue{typeof(^)}, Vector{Float64}, Base.RefValue{Val{2}}})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:0
  [9] (::∂⃖{1})(::typeof(Base.Broadcast.combine_eltypes), ::Function, ::Vararg{Any})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:215
 [10] copy
    @ ./broadcast.jl:882 [inlined]
 [11] (::∂⃖{1})(f::typeof(copy), args::Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{1}, Tuple{Base.OneTo{Int64}}, typeof(Base.literal_pow), Tuple{Base.RefValue{typeof(^)}, Vector{Float64}, Base.RefValue{Val{2}}}})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:215
 [12] materialize
    @ ./broadcast.jl:860 [inlined]
 [13] (::∂⃖{1})(f::typeof(Base.Broadcast.materialize), args::Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{1}, Nothing, typeof(Base.literal_pow), Tuple{Base.RefValue{typeof(^)}, Vector{Float64}, Base.RefValue{Val{2}}}})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:215
 [14] #126
    @ ./REPL[100]:1 [inlined]
 [15] (::Diffractor.∂⃖recurse{1})(::var"#126#127", ::Vector{Float64})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:0
 [16] (::∂⃖{1})(f::var"#126#127", args::Vector{Float64})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:215
 [17] ∂⃖(::Function, ::Vararg{Any})
    @ Diffractor ~/.julia/dev/Diffractor/src/interface.jl:25
 [18] (::Diffractor.∇{var"#126#127"})(args::Vector{Float64})
    @ Diffractor ~/.julia/dev/Diffractor/src/interface.jl:121
 [19] Diffractor.∇(::Function, ::Vector{Float64})
    @ Diffractor ~/.julia/dev/Diffractor/src/interface.jl:128
 [20] top-level scope
    @ REPL[100]:1

julia> gradient(x -> sum(x .* x'), [1,2,3.])
([12.0, 12.0, 12.0],)

julia> gradient(x -> sum(x ./ x'), [1,2,3.])
ERROR: Rewrite reached intrinsic function bitcast. Missing rule?
Stacktrace:
  [1] error(s::String)
    @ Base ./error.jl:33
  [2] (::∂⃖{1})(::Core.IntrinsicFunction, ::Type, ::Vararg{Any})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:205
  [3] UInt64
    @ ./boot.jl:768 [inlined]
  [4] (::Diffractor.∂⃖recurse{1})(::Type{UInt64}, ::Ptr{Float64})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:0
  [5] (::∂⃖{1})(f::Type{UInt64}, args::Ptr{Float64})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:215
  [6] dataids
    @ ./abstractarray.jl:1457 [inlined]
  [7] (::Diffractor.∂⃖recurse{1})(::typeof(Base.dataids), ::Matrix{Float64})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:0
  [8] (::∂⃖{1})(f::typeof(Base.dataids), args::Matrix{Float64})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:215
  [9] mightalias
    @ ./abstractarray.jl:1433 [inlined]
 [10] ∂⃖
    @ ~/.julia/dev/Diffractor/src/stage1/generated.jl:215 [inlined]
 [11] unalias
    @ ./abstractarray.jl:1398 [inlined]
 [12] ∂⃖
    @ ~/.julia/dev/Diffractor/src/stage1/generated.jl:215 [inlined]
 [13] broadcast_unalias
    @ ./broadcast.jl:934 [inlined]
 [14] ∂⃖
    @ ~/.julia/dev/Diffractor/src/stage1/generated.jl:215 [inlined]
 [15] preprocess
    @ ./broadcast.jl:941 [inlined]
 [16] ∂⃖
    @ ~/.julia/dev/Diffractor/src/stage1/generated.jl:215 [inlined]
 [17] preprocess_args
    @ ./broadcast.jl:943 [inlined]
 [18] ∂⃖
    @ ~/.julia/dev/Diffractor/src/stage1/generated.jl:215 [inlined]
 [19] preprocess
    @ ./broadcast.jl:940 [inlined]
 [20] ∂⃖
    @ ~/.julia/dev/Diffractor/src/stage1/generated.jl:215 [inlined]
 [21] copyto!
    @ ./broadcast.jl:957 [inlined]
 [22] (::Diffractor.∂⃖recurse{1})(::typeof(copyto!), ::Matrix{Float64}, ::Base.Broadcast.Broadcasted{Nothing, Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}, typeof(/), Tuple{Vector{Float64}, Adjoint{Float64, Vector{Float64}}}})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:0
 [23] (::∂⃖{1})(::typeof(copyto!), ::Matrix{Float64}, ::Vararg{Any})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:215
 [24] copyto!
    @ ./broadcast.jl:913 [inlined]
 [25] ∂⃖
    @ ~/.julia/dev/Diffractor/src/stage1/generated.jl:215 [inlined]
 [26] copy
    @ ./broadcast.jl:885 [inlined]
 [27] ∂⃖
    @ ~/.julia/dev/Diffractor/src/stage1/generated.jl:215 [inlined]
 [28] materialize
    @ ./broadcast.jl:860 [inlined]
 [29] (::∂⃖{1})(f::typeof(Base.Broadcast.materialize), args::Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{2}, Nothing, typeof(/), Tuple{Vector{Float64}, Adjoint{Float64, Vector{Float64}}}})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:215
 [30] #130
    @ ./REPL[102]:1 [inlined]
 [31] ∂⃖
    @ ~/.julia/dev/Diffractor/src/stage1/generated.jl:215 [inlined]
 [32] ∂⃖
    @ ~/.julia/dev/Diffractor/src/interface.jl:25 [inlined]
 [33] (::Diffractor.∇{var"#130#131"})(args::Vector{Float64})
    @ Diffractor ~/.julia/dev/Diffractor/src/interface.jl:121
 [34] Diffractor.∇(::Function, ::Vector{Float64})
    @ Diffractor ~/.julia/dev/Diffractor/src/interface.jl:128
 [35] top-level scope
    @ REPL[102]:1

julia> using LinearAlgebra

julia> function fjac(f, x)
         delta = Matrix(I, length(x), length(x))
         slice(k) = vec(frule_via_ad(DiffractorRuleConfig(), (0, reshape(delta[:,k], axes(x))), f, x)[2])
         reduce(hcat, [slice(k) for k in LinearIndices(x)])
       end
fjac (generic function with 1 method)

julia> fjac(x -> x .^ 2, [1,2,3.])
3×3 Matrix{Float64}:
 2.0  0.0  0.0
 0.0  4.0  0.0
 0.0  0.0  6.0

julia> fjac(x -> x ./ x', [1,2,3.])
ERROR: MethodError: no method matching unbundle(::Diffractor.CompositeBundle{1, Adjoint{Float64, Vector{Float64}}, Tuple{Diffractor.TangentBundle{1, Vector{Float64}, Tuple{Vector{Bool}}}}})
Closest candidates are:
  unbundle(::Diffractor.TangentBundle{Order, A}) where {Order, Dim, T, A<:AbstractArray{T, Dim}} at /Users/me/.julia/dev/Diffractor/src/tangent.jl:229
  unbundle(::Diffractor.TaylorBundle{Order, A}) where {Order, Dim, T, A<:AbstractArray{T, Dim}} at /Users/me/.julia/dev/Diffractor/src/tangent.jl:247
  unbundle(::Diffractor.UniformBundle{N, A, ZeroTangent}) where {N, T, Dim, A<:AbstractArray{T, Dim}} at /Users/me/.julia/dev/Diffractor/src/tangent.jl:272
Stacktrace:
  [1] (::Diffractor.var"#236#237"{1, ∂☆{1}, Diffractor.CompositeBundle{1, Tuple{Vector{Float64}, Adjoint{Float64, Vector{Float64}}}, Tuple{Diffractor.TangentBundle{1, Vector{Float64}, Tuple{Vector{Bool}}}, Diffractor.CompositeBundle{1, Adjoint{Float64, Vector{Float64}}, Tuple{Diffractor.TangentBundle{1, Vector{Float64}, Tuple{Vector{Bool}}}}}}}})(i::Int64)
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/broadcast.jl:23
  [2] ntuple(f::Diffractor.var"#236#237"{1, ∂☆{1}, Diffractor.CompositeBundle{1, Tuple{Vector{Float64}, Adjoint{Float64, Vector{Float64}}}, Tuple{Diffractor.TangentBundle{1, Vector{Float64}, Tuple{Vector{Bool}}}, Diffractor.CompositeBundle{1, Adjoint{Float64, Vector{Float64}}, Tuple{Diffractor.TangentBundle{1, Vector{Float64}, Tuple{Vector{Bool}}}}}}}}, n::Int64)
    @ Base ./ntuple.jl:19
  [3] (::∂☆{1})(zc::Diffractor.UniformBundle{1, typeof(copy), ZeroTangent}, bc::Diffractor.CompositeBundle{1, Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{2}, Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}, typeof(/), Tuple{Vector{Float64}, Adjoint{Float64, Vector{Float64}}}}, Tuple{Diffractor.UniformBundle{1, typeof(/), ZeroTangent}, Diffractor.CompositeBundle{1, Tuple{Vector{Float64}, Adjoint{Float64, Vector{Float64}}}, Tuple{Diffractor.TangentBundle{1, Vector{Float64}, Tuple{Vector{Bool}}}, Diffractor.CompositeBundle{1, Adjoint{Float64, Vector{Float64}}, Tuple{Diffractor.TangentBundle{1, Vector{Float64}, Tuple{Vector{Bool}}}}}}}, Diffractor.CompositeBundle{1, Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}, Tuple{Diffractor.TangentBundle{1, Base.OneTo{Int64}, Tuple{ChainRulesCore.NoTangent}}, Diffractor.TangentBundle{1, Base.OneTo{Int64}, Tuple{ChainRulesCore.NoTangent}}}}}})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/broadcast.jl:16
  [4] ∂☆recurse
    @ ./broadcast.jl:860 [inlined]
  [5] (::Diffractor.∂☆recurse{1})(::Diffractor.UniformBundle{1, typeof(Base.Broadcast.materialize), ZeroTangent}, ::Diffractor.CompositeBundle{1, Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{2}, Nothing, typeof(/), Tuple{Vector{Float64}, Adjoint{Float64, Vector{Float64}}}}, Tuple{Diffractor.UniformBundle{1, typeof(/), ZeroTangent}, Diffractor.CompositeBundle{1, Tuple{Vector{Float64}, Adjoint{Float64, Vector{Float64}}}, Tuple{Diffractor.TangentBundle{1, Vector{Float64}, Tuple{Vector{Bool}}}, Diffractor.CompositeBundle{1, Adjoint{Float64, Vector{Float64}}, Tuple{Diffractor.TangentBundle{1, Vector{Float64}, Tuple{Vector{Bool}}}}}}}, Diffractor.UniformBundle{1, Nothing, ZeroTangent}}})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/recurse_fwd.jl:0
  [6] (::Diffractor.∂☆internal{1})(::Diffractor.UniformBundle{1, typeof(Base.Broadcast.materialize), ZeroTangent}, ::Vararg{Diffractor.AbstractTangentBundle{1}})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/forward.jl:115
  [7] (::∂☆{1})(::Diffractor.UniformBundle{1, typeof(Base.Broadcast.materialize), ZeroTangent}, ::Vararg{Diffractor.AbstractTangentBundle{1}})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/forward.jl:140
  [8] ∂☆recurse
    @ ./REPL[104]:1 [inlined]
  [9] (::Diffractor.∂☆recurse{1})(::Diffractor.TangentBundle{1, var"#134#135", Tuple{Int64}}, ::Diffractor.TangentBundle{1, Vector{Float64}, Tuple{Vector{Bool}}})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/recurse_fwd.jl:0
 [10] (::Diffractor.∂☆internal{1})(::Diffractor.TangentBundle{1, var"#134#135", Tuple{Int64}}, ::Vararg{Diffractor.AbstractTangentBundle{1}})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/forward.jl:115
 [11] frule_via_ad(::DiffractorRuleConfig, ::Tuple{Int64, Vector{Bool}}, ::Function, ::Vararg{Any})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/forward.jl:123
 [12] (::var"#slice#122"{var"#134#135", Vector{Float64}, Matrix{Bool}})(k::Int64)
    @ Main ./REPL[98]:3
 [13] #121
    @ ./none:0 [inlined]
 [14] iterate
    @ ./generator.jl:47 [inlined]
 [15] collect(itr::Base.Generator{LinearIndices{1, Tuple{Base.OneTo{Int64}}}, var"#121#123"{var"#slice#122"{var"#134#135", Vector{Float64}, Matrix{Bool}}}})
    @ Base ./array.jl:774
 [16] fjac(f::Function, x::Vector{Float64})
    @ Main ./REPL[98]:4
 [17] top-level scope
    @ REPL[104]:1

julia> versioninfo()
Julia Version 1.8.0-DEV.502
Commit c4f0d8b2f6 (2021-09-10 16:28 UTC)
Platform Info:
  OS: macOS (arm64-apple-darwin20.6.0)
  CPU: Apple M1
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-12.0.1 (ORCJIT, cyclone)
Environment:
  JULIA_NUM_THREADS = 4

Edit -- maybe this isn't meant to work? I thought it copied Zygote's unfused rrule(::typeof(broadcasted), f, xs::Numeric...) but in fact I see rules only for a handful of functions, except when there is just one argument.

Tangent being Constructed with wrong type in forward mode

Found this when trying to reproduce something else.
I thought we had fixed these, but I guess not.

julia> using Diffractor

julia> var"'" = Diffractor.PrimeDerivativeFwd
Diffractor.PrimeDerivativeFwd

julia> struct Foo
       x::Float64
       end

julia> f(x) = Foo(x)
f (generic function with 1 method)

julia> f(23)
Foo(23.0)

julia> f'(23)
ERROR: ArgumentError: Tangent for the primal Foo should be backed by a NamedTuple type, not by Tuple{Int64}.
Stacktrace:
 [1] _backing_error(P::Type, G::Type, E::Type)
   @ ChainRulesCore ~/.julia/packages/ChainRulesCore/0t04l/src/tangent_types/tangent.jl:62
 [2] ChainRulesCore.Tangent{Foo, Tuple{Int64}}(backing::Tuple{Int64})
   @ ChainRulesCore ~/.julia/packages/ChainRulesCore/0t04l/src/tangent_types/tangent.jl:36
 [3] (ChainRulesCore.Tangent{Foo})(args::Int64)
   @ ChainRulesCore ~/.julia/packages/ChainRulesCore/0t04l/src/tangent_types/tangent.jl:48
 [4] getindex(tb::Diffractor.CompositeBundle{1, Foo, Tuple{Diffractor.TangentBundle{1, Float64, Diffractor.TaylorTangent{Tuple{Int64}}}}}, tti::Diffractor.TaylorTangentIndex)
   @ Diffractor ~/JuliaEnvs/Diffractor.jl/src/tangent.jl:299
 [5] (::Diffractor.PrimeDerivativeFwd{1, typeof(f)})(x::Int64)
   @ Diffractor ~/JuliaEnvs/Diffractor.jl/src/interface.jl:180
 [6] top-level scope
   @ REPL[9]:1

Mutating in closure

julia> using Diffractor

julia> D(f, x) = Diffractor.PrimeDerivativeFwd(f)(x)
D (generic function with 1 method)

julia> D(1) do x
         f(y) = (x = x*y)
         D(f, 1)
         D(f, 1)
       end
ERROR: MethodError: no method matching apply_type(::Type{Diffractor.PrimeDerivativeFwd}, ::Int64, ::Type{var"#f#12"})
Stacktrace:
  [1] ∂☆nomethd(args::Any)
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/recurse_fwd.jl:115

Example from a ForwardDiff issue by Mike Innes: JuliaDiff/ForwardDiff.jl#443

Bounds error in transform! when taking gradients of functions with dict data

Below fails with BoundsError: attempt to access 3-element Vector{Core.Compiler.BasicBlock} at index [4]

Julia Version 1.10.0-DEV.434
Commit 0231c224a1f (2023-01-26 20:27 UTC)

foo = Dict{Symbol,Float32}()
foo[:a] = 30.0f0
foo[:b] = 40.0f0

struct Something
    data::Dict{Symbol,Float32}
end

s = Something(foo)

function stuff(x::Something, t::Float32)
    x.data[:a] * t*t + x.data[:b]*t
end

Diffractor.gradient(y -> stuff(y, 3.0f0), s)

`opaque_closure_method: invalid syntax`

on master:

julia> ((x,y) -> x*y)(1,2)
ERROR: opaque_closure_method: invalid syntax
Stacktrace:
 [1] (::Diffractor.∂⃖{1})(::var"#3#4", ::Int64, ::Vararg{Int64})
   @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:215
 [2] Diffractor.∂⃖(::Function, ::Vararg{Any})
   @ Diffractor ~/.julia/dev/Diffractor/src/interface.jl:25
 [3] (::∇{var"#3#4"})(::Int64, ::Vararg{Int64})
   @ Diffractor ~/.julia/dev/Diffractor/src/interface.jl:121
 [4] top-level scope
   @ REPL[19]:1

julia> versioninfo()
Julia Version 1.9.0-DEV.470
Commit 902a5c199d (2022-05-05 15:00 UTC)

UndefVarError: @aggressive_constprop not defined in precompiling

Hi there, I used ] add https://github.com/JuliaDiff/Diffractor.jl.git to get Diffractor, but I ran into a precompilation error:

julia> using Diffractor 
[ Info: Precompiling Diffractor [9f5e2b26-1114-432f-b630-d3fe2085c51c]
ERROR: LoadError: LoadError: LoadError: UndefVarError: @aggressive_constprop not defined
Stacktrace:
 [1] include(::Function, ::Module, ::String) at ./Base.jl:380
 [2] include at ./Base.jl:368 [inlined]
 [3] include(::String) at /root/.julia/packages/Diffractor/2Ott3/src/Diffractor.jl:1
 [4] top-level scope at /root/.julia/packages/Diffractor/2Ott3/src/Diffractor.jl:7
 [5] include(::Function, ::Module, ::String) at ./Base.jl:380
 [6] include(::Module, ::String) at ./Base.jl:368
 [7] top-level scope at none:2
 [8] eval at ./boot.jl:331 [inlined]
 [9] eval(::Expr) at ./client.jl:467
 [10] top-level scope at ./none:3
in expression starting at /root/.julia/packages/Diffractor/2Ott3/src/runtime.jl:3
in expression starting at /root/.julia/packages/Diffractor/2Ott3/src/runtime.jl:3
in expression starting at /root/.julia/packages/Diffractor/2Ott3/src/Diffractor.jl:7
ERROR: Failed to precompile Diffractor [9f5e2b26-1114-432f-b630-d3fe2085c51c] to /root/.julia/compiled/v1.5/Diffractor/vzwwW_E53n4.ji.
Stacktrace:
 [1] error(::String) at ./error.jl:33
 [2] compilecache(::Base.PkgId, ::String) at ./loading.jl:1305
 [3] _require(::Base.PkgId) at ./loading.jl:1030
 [4] require(::Base.PkgId) at ./loading.jl:928
 [5] require(::Module, ::Symbol) at ./loading.jl:923
 [6] eval at ./boot.jl:331 [inlined]
 [7] eval at ./Base.jl:39 [inlined]
 [8] repleval(::Module, ::Expr, ::String) at /root/.vscode-server-insiders/extensions/julialang.language-julia-1.3.30/scripts/packages/VSCodeServer/src/repl.jl:157
 [9] (::VSCodeServer.var"#69#71"{Module,Expr,REPL.LineEditREPL,REPL.LineEdit.Prompt})() at /root/.vscode-server-insiders/extensions/julialang.language-julia-1.3.30/scripts/packages/VSCodeServer/src/repl.jl:123
 [10] with_logstate(::Function, ::Any) at ./logging.jl:408
 [11] with_logger at ./logging.jl:514 [inlined]
 [12] (::VSCodeServer.var"#68#70"{Module,Expr,REPL.LineEditREPL,REPL.LineEdit.Prompt})() at /root/.vscode-server-insiders/extensions/julialang.language-julia-1.3.30/scripts/packages/VSCodeServer/src/repl.jl:124
 [13] #invokelatest#1 at ./essentials.jl:710 [inlined]
 [14] invokelatest(::Any) at ./essentials.jl:709
 [15] macro expansion at /root/.vscode-server-insiders/extensions/julialang.language-julia-1.3.30/scripts/packages/VSCodeServer/src/eval.jl:34 [inlined]
 [16] (::VSCodeServer.var"#53#54")() at ./task.jl:356

Precompilation fails on Julia v1.7.0-rc1

julia> import Pkg; Pkg.precompile()
Precompiling project...
  ✗ Diffractor
  0 dependencies successfully precompiled in 2 seconds (24 already precompiled)

ERROR: The following 1 direct dependency failed to precompile:

Diffractor [9f5e2b26-1114-432f-b630-d3fe2085c51c]

Failed to precompile Diffractor [9f5e2b26-1114-432f-b630-d3fe2085c51c] to /Users/sethaxen/.julia/compiled/v1.7/Diffractor/jl_Nt5Rnu.
ERROR: LoadError: UndefVarError: @constprop not defined
Stacktrace:
 [1] include(mod::Module, _path::String)
   @ Base ./Base.jl:420
 [2] include(x::String)
   @ Diffractor ~/.julia/packages/Diffractor/HYuxt/src/Diffractor.jl:1
 [3] top-level scope
   @ ~/.julia/packages/Diffractor/HYuxt/src/Diffractor.jl:7
 [4] include
   @ ./Base.jl:420 [inlined]
 [5] include_package_for_output(pkg::Base.PkgId, input::String, depot_path::Vector{String}, dl_load_path::Vector{String}, load_path::Vector{String}, concrete_deps::Vector{Pair{Base.PkgId, UInt64}}, source::Nothing)
   @ Base ./loading.jl:1318
 [6] top-level scope
   @ none:1
 [7] eval
   @ ./boot.jl:373 [inlined]
 [8] eval(x::Expr)
   @ Base.MainInclude ./client.jl:453
 [9] top-level scope
   @ none:1

Docs are broken

Seems like the latest dev docs are missing the new getting started section, so they must be a bit broken

TagBot trigger issue

This issue is used to trigger TagBot; feel free to unsubscribe.

If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.

If you'd like for me to do this for you, comment TagBot fix on this issue.
I'll open a PR within a few hours, please be patient!

Diffractor broken on Julia nightly due to https://github.com/JuliaLang/julia/pull/49113

Running the tests gives

Diffractor.jl: Error During Test at /home/oscardssmith/.julia/dev/Diffractor/test/runtests.jl:30
  Test threw exception
  Expression: tup2(my_tuple) == (ZeroTangent(), 4)
  MethodError: no method matching replace_code_newstyle!(::Core.CodeInfo, ::Core.Compiler.IRCode, ::Int64)
  
  Closest candidates are:
    replace_code_newstyle!(::Core.CodeInfo, ::Core.Compiler.IRCode)
     @ Core compiler/ssair/legacy.jl:67
  
  Stacktrace:
    [1] transform!(ci::Core.CodeInfo, meth::Method, nargs::Int64, sparams::Core.SimpleVector, N::Int64)
      @ Diffractor ~/.julia/dev/Diffractor/src/stage1/recurse.jl:284
    [2] perform_optic_transform(ff::Type{Diffractor.∂⃖recurse{1}}, args::Any)
      @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:25
    [3] (::Core.GeneratedFunctionStub)(::Any, ::Vararg{Any})
      @ Core ./boot.jl:602
    [4] ∂⃖
      @ ~/.julia/dev/Diffractor/src/stage1/generated.jl:216 [inlined]
    [5] ∂⃖
      @ ~/.julia/dev/Diffractor/src/stage1/generated.jl:66 [inlined]
    [6] (::var"#tup2#77")(f::var"#my_tuple#78")
      @ Main ~/.julia/dev/Diffractor/test/runtests.jl:20
    [7] macro expansion
      @ ~/julia/usr/share/julia/stdlib/v1.10/Test/src/Test.jl:478 [inlined]
    [8] macro expansion
      @ ~/.julia/dev/Diffractor/test/runtests.jl:30 [inlined]
    [9] macro expansion
      @ ~/julia/usr/share/julia/stdlib/v1.10/Test/src/Test.jl:1504 [inlined]
   [10] top-level scope
      @ ~/.julia/dev/Diffractor/test/runtests.jl:16

`x.re` gives a Tangent not a Complex

julia> gradient(x -> x.re, 2+3im)
(Tangent{Complex{Int64}}(re = 1,),)

julia> gradient(x -> abs2(x * x.re), 4+5im)
ERROR: MethodError: no method matching +(::ComplexF64, ::Tangent{Complex{Int64}, NamedTuple{(:re,), Tuple{Float64}}})

Even very basic broadcasting breaks inferability

Even very simple broadcasting can't infer in Diffractor forward mode

Consider:

using Diffractor
using Diffractor: TaylorBundle, ZeroBundle, CompositeBundle
forward = Diffractor.∂☆{1}()
f(x) = 2 .* x
cb = CompositeBundle{1, Tuple{Float64, Float64}}((TaylorBundle{1}(1.0, (1.0,)), TaylorBundle{1}(1.0, (2.0,))))
@code_warntype forward(ZeroBundle{1}(f), cb)

The output is

julia> @code_warntype forward(ZeroBundle{1}(f), cb)
MethodInstance for (::Diffractor.∂☆{1})(::ZeroBundle{1, typeof(f)}, ::CompositeBundle{1, Tuple{Float64, Float64}, Tuple{Diffractor.TangentBundle{1, Float64, Diffractor.TaylorTangent{Tuple{Float64}}}, Diffractor.TangentBundle{1, Float64, Diffractor.TaylorTangent{Tuple{Float64}}}}})
  from (::Diffractor.∂☆{N})(args::Diffractor.AbstractTangentBundle{N}...) where N @ Diffractor ~/.julia/packages/Diffractor/a9mlv/src/stage1/forward.jl:148
Static Parameters
  N = 1
Arguments
  #self#::Core.Const(Diffractor.∂☆{1}())
  args::Tuple{ZeroBundle{1, typeof(f)}, CompositeBundle{1, Tuple{Float64, Float64}, Tuple{Diffractor.TangentBundle{1, Float64, Diffractor.TaylorTangent{Tuple{Float64}}}, Diffractor.TangentBundle{1, Float64, Diffractor.TaylorTangent{Tuple{Float64}}}}}}
Body::Any
1 ─ %1 = Core.apply_type(Diffractor.∂☆internal, $(Expr(:static_parameter, 1)))::Core.Const(Diffractor.∂☆internal{1})
│   %2 = (%1)()::Core.Const(Diffractor.∂☆internal{1}())
│   %3 = Core._apply_iterate(Base.iterate, %2, args)::Any
└──      return %3

Control flow not supported in higher order reverse mode AD

julia> using Diffractor: var"'"

julia> (x->x*x*x*x)''(1.0)
stmt = :(goto %3 if not %7)
stmt = :(goto %3 if not %7)
stmt = :(goto %3 if not %7)
stmt = :(goto %3 if not %7)
stmt = :(goto %3 if not %7)
ERROR: 
Stacktrace:
  [1] error()
    @ Base ./error.jl:42
  [2] transform!(ci::Core.CodeInfo, meth::Method, nargs::Int64, sparams::Core.SimpleVector, N::Int64)
    @ Diffractor ~/.julia/packages/Diffractor/A8Ou2/src/stage1/recurse.jl:709
  [3] perform_optic_transform(ff::Type{Diffractor.∂⃖recurse{2}}, args::Any)
    @ Diffractor ~/.julia/packages/Diffractor/A8Ou2/src/stage1/generated.jl:25
  [4] (::Core.GeneratedFunctionStub)(::Any, ::Vararg{Any})
    @ Core ./boot.jl:580
  [5] (::Diffractor.∂⃖{2})(::typeof(Base.afoldl), ::Function, ::Vararg{Any})
    @ Diffractor ~/.julia/packages/Diffractor/A8Ou2/src/stage1/generated.jl:218
  [6] (::Diffractor.∂⃖{2})(::typeof(Core._apply_iterate), ::Function, ::Function, ::Tuple{typeof(*), Float64}, ::Vararg{Union{Tuple, NamedTuple}})
    @ Diffractor ~/.julia/packages/Diffractor/A8Ou2/src/stage1/generated.jl:364
  [7] *
    @ ./operators.jl:655 [inlined]
  [8] ∂⃖
    @ ~/.julia/packages/Diffractor/A8Ou2/src/extra_rules.jl:85 [inlined]
  [9] #1
    @ ./REPL[2]:1 [inlined]
 [10] ∂⃖
    @ ~/.julia/packages/Diffractor/A8Ou2/src/stage1/generated.jl:218 [inlined]
 [11] ∂⃖
    @ ~/.julia/packages/Diffractor/A8Ou2/src/stage1/generated.jl:61 [inlined]
 [12] PrimeDerivativeBack
    @ ~/.julia/packages/Diffractor/A8Ou2/src/interface.jl:157 [inlined]
 [13] (::Diffractor.∂⃖recurse{1})(::Diffractor.PrimeDerivativeBack{1, var"#1#2"}, ::Float64)
    @ Diffractor ~/.julia/packages/Diffractor/A8Ou2/src/stage1/generated.jl:0
 [14] ∂⃖
    @ ~/.julia/packages/Diffractor/A8Ou2/src/stage1/generated.jl:211 [inlined]
 [15] (::Diffractor.PrimeDerivativeBack{2, var"#1#2"})(x::Float64)
    @ Diffractor ~/.julia/packages/Diffractor/A8Ou2/src/interface.jl:157
 [16] top-level scope
    @ REPL[2]:1

Splatting arrays

Diffractor doesn't seem to like array splats:

julia> gradient(x -> max(x...), (1,2,3))[1]
(0.0, 0.0, 1.0)

julia> gradient(x -> max(x...), [1,2,3])[1]
ERROR: MethodError: no method matching (::Diffractor.∂⃖recurse{1})(::typeof(Core._apply_iterate), ::typeof(iterate), ::typeof(max), ::Vector{Int64})
Closest candidates are:
  (::Diffractor.∂⃖recurse)(::Any...) at /Users/me/.julia/packages/Diffractor/2Ott3/src/stage1/generated.jl:398
Stacktrace:
  [1] macro expansion
    @ ~/.julia/packages/Diffractor/2Ott3/src/stage1/generated.jl:0 [inlined]
  [2] (::Diffractor.∂⃖recurse{1})(::typeof(Core._apply_iterate), ::typeof(iterate), ::typeof(max), ::Vector{Int64})
    @ Diffractor ~/.julia/packages/Diffractor/2Ott3/src/stage1/generated.jl:407
  [3] (::∂⃖{1})(::typeof(Core._apply_iterate), ::Function, ::Vararg{Any})
    @ Diffractor ~/.julia/packages/Diffractor/2Ott3/src/stage1/generated.jl:214
  [4] #9
    @ ./REPL[6]:1 [inlined]
  [5] (::Diffractor.∂⃖recurse{1})(::var"#9#10", ::Vector{Int64})
    @ Diffractor ~/.julia/packages/Diffractor/2Ott3/src/stage1/generated.jl:0
  [6] (::∂⃖{1})(f::var"#9#10", args::Vector{Int64})
    @ Diffractor ~/.julia/packages/Diffractor/2Ott3/src/stage1/generated.jl:214
  [7] ∂⃖(::Function, ::Vararg{Any})
    @ Diffractor ~/.julia/packages/Diffractor/2Ott3/src/interface.jl:25
  [8] (::Diffractor.∇{var"#9#10"})(args::Vector{Int64})
    @ Diffractor ~/.julia/packages/Diffractor/2Ott3/src/interface.jl:121
  [9] Diffractor.∇(::Function, ::Vector{Int64})
    @ Diffractor ~/.julia/packages/Diffractor/2Ott3/src/interface.jl:128

Zygote also gets this wrong: FluxML/Zygote.jl#599

Diffractor can't differentiate through `view`

I'm not completely sure if this is genuinely an error, or just a result of how Diffractor is calculating gradients, but it seems that Diffractor struggles with array views. As with #40, this is an issue related to TransformVariables.jl.

MWE:

A=[1,2,3,4]
into(v::AbstractVector, i, len) = v[i:(i+len-1)]
Diffractor.∂⃖recurse{1}()(into, A, 1, 2) #Works

view_into(v::AbstractVector, i, len) = view(v, i:(i+len-1))
Diffractor.∂⃖recurse{1}()(view_into, A, 1, 2) #Doesn't work

Stack trace for view_into:

ERROR: Rewrite reached intrinsic function and_int. Missing rule?
Stacktrace:
  [1] error(s::String)
    @ Base ./error.jl:33
  [2] (::∂⃖{1})(::Core.IntrinsicFunction, ::Bool, ::Vararg{Bool})
    @ Diffractor ~/data/code/DeepPumas.jl/EnvironmentCopy/dev/Diffractor.jl/src/stage1/generated.jl:204
  [3] &
    @ ./bool.jl:38 [inlined]
  [4] (::Diffractor.∂⃖recurse{1})(::typeof(&), ::Bool, ::Bool)
    @ Diffractor ~/data/code/DeepPumas.jl/EnvironmentCopy/dev/Diffractor.jl/src/stage1/generated.jl:0
  [5] (::∂⃖{1})(::typeof(&), ::Bool, ::Vararg{Bool})
    @ Diffractor ~/data/code/DeepPumas.jl/EnvironmentCopy/dev/Diffractor.jl/src/stage1/generated.jl:214
  [6] checkindex
    @ ./abstractarray.jl:718 [inlined]
  [7] (::Diffractor.∂⃖recurse{1})(::typeof(checkindex), ::Type{Bool}, ::Base.OneTo{Int64}, ::Int64)
    @ Diffractor ~/data/code/DeepPumas.jl/EnvironmentCopy/dev/Diffractor.jl/src/stage1/generated.jl:0
  [8] (::∂⃖{1})(::typeof(checkindex), ::Type, ::Vararg{Any})
    @ Diffractor ~/data/code/DeepPumas.jl/EnvironmentCopy/dev/Diffractor.jl/src/stage1/generated.jl:214
  [9] checkindex
    @ ./abstractarray.jl:723 [inlined]
 [10] (::Diffractor.∂⃖recurse{1})(::typeof(checkindex), ::Type{Bool}, ::Base.OneTo{Int64}, ::UnitRange{Int64})
    @ Diffractor ~/data/code/DeepPumas.jl/EnvironmentCopy/dev/Diffractor.jl/src/stage1/generated.jl:0
 [11] (::∂⃖{1})(::typeof(checkindex), ::Type, ::Vararg{Any})
    @ Diffractor ~/data/code/DeepPumas.jl/EnvironmentCopy/dev/Diffractor.jl/src/stage1/generated.jl:214
 [12] ∂⃖recurse
    @ ./abstractarray.jl:644 [inlined]
 [13] (::Diffractor.∂⃖recurse{1})(::typeof(checkbounds), ::Type{Bool}, ::Vector{Int64}, ::UnitRange{Int64})
    @ Diffractor ~/data/code/DeepPumas.jl/EnvironmentCopy/dev/Diffractor.jl/src/stage1/generated.jl:0
 [14] (::∂⃖{1})(::typeof(checkbounds), ::Type, ::Vararg{Any})
    @ Diffractor ~/data/code/DeepPumas.jl/EnvironmentCopy/dev/Diffractor.jl/src/stage1/generated.jl:214
 [15] (::∂⃖{1})(::typeof(Core._apply_iterate), ::Function, ::Function, ::Tuple{DataType, Vector{Int64}}, ::Vararg{Union{Tuple, Vector, NamedTuple}})
    @ Diffractor ~/data/code/DeepPumas.jl/EnvironmentCopy/dev/Diffractor.jl/src/stage1/generated.jl:371
 [16] checkbounds
    @ ./abstractarray.jl:659 [inlined]
 [17] ∂⃖
    @ ~/data/code/DeepPumas.jl/EnvironmentCopy/dev/Diffractor.jl/src/stage1/generated.jl:214 [inlined]
 [18] ∂⃖
    @ ~/data/code/DeepPumas.jl/EnvironmentCopy/dev/Diffractor.jl/src/stage1/generated.jl:371 [inlined]
 [19] view
    @ ./subarray.jl:177 [inlined]
 [20] (::Diffractor.∂⃖recurse{1})(::typeof(view), ::Vector{Int64}, ::UnitRange{Int64})
    @ Diffractor ~/data/code/DeepPumas.jl/EnvironmentCopy/dev/Diffractor.jl/src/stage1/generated.jl:0
 [21] (::∂⃖{1})(::typeof(view), ::Vector{Int64}, ::Vararg{Any})
    @ Diffractor ~/data/code/DeepPumas.jl/EnvironmentCopy/dev/Diffractor.jl/src/stage1/generated.jl:214
 [22] ∂⃖recurse
    @ ./REPL[21739]:1 [inlined]
 [23] (::Diffractor.∂⃖recurse{1})(::typeof(view_into3), ::Vector{Int64}, ::Int64, ::Int64)
    @ Diffractor ~/data/code/DeepPumas.jl/EnvironmentCopy/dev/Diffractor.jl/src/stage1/generated.jl:0
 [24] top-level scope
    @ REPL[21741]:1

Cannot differentiate through `isdefined`

julia> using Diffractor, OrdinaryDiffEq, ForwardDiff

julia> function foo(tf)
           prob = ODEProblem((u,p,t)->1.01u, tf, (0.0, tf))
           solve(prob, Tsit5())[end]
       end
foo (generic function with 1 method)

julia> ForwardDiff.derivative(foo, 1.0)
5.518656256946362

julia> Diffractor.PrimeDerivativeBack(foo)(1.0)
ERROR: MethodError: no method matching (::Diffractor.∂⃖recurse{1})(::typeof(isdefined), ::var"#2#3", ::Symbol)
Closest candidates are:
  (::Diffractor.∂⃖recurse)(::Any...) at /Users/scheme/src/julia/Diffractor/src/stage1/generated.jl:381
Stacktrace:
  [1] macro expansion
    @ ~/src/julia/Diffractor/src/stage1/generated.jl:0 [inlined]
  [2] (::Diffractor.∂⃖recurse{1})(::typeof(isdefined), ::var"#2#3", ::Symbol)
    @ Diffractor ~/src/julia/Diffractor/src/stage1/generated.jl:390

forward_diff_no_inf can't handle throw_undef_if_not

Ran into this while trying to reproduce another bug.
The following code errors:

using Diffractor
# this is needed as transform! is *always* called on Arguments regardless of what visit_custom says
identity_transform!(ir, ssa::Core.SSAValue, order) = ir[ssa]
function identity_transform!(ir, arg::Core.Argument, order)
    return Core.Compiler.insert_node!(ir, Core.SSAValue(1), Core.Compiler.NewInstruction(Expr(:call, Diffractor.ZeroBundle{1}, arg), Any))
end

function phi_run(x::Float64, b1::Bool, b2::Bool)
    if b1
        a = 0.
    elseif b2
        a = z
    end
    return x - a
end

input_ir = first(only(Base.code_ircode(phi_run, Tuple{Float64, Bool, Bool})))
Diffractor.forward_diff_no_inf!(copy(input_ir), Core.SSAValue.(1:length(input_ir.stmts)) .=> 1; transform! = identity_transform!)

I think the fix is easy: add a case to forward_visit saying what to do for this kind of node, which I think is to return nothing.
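
For reference, the offending statement kind can be seen without any Diffractor machinery. Assuming the phi_run definition above, the plain lowered code already contains the Expr(:throw_undef_if_not, :a, ...) node that forward_visit currently has no case for:

using InteractiveUtils

# Plain-Base inspection of the lowered code; no Diffractor involved. The
# output includes a throw_undef_if_not statement guarding the possibly
# undefined variable `a`.
@code_lowered phi_run(1.0, false, false)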

Reverse mode tests broken

The reverse mode tests were broken by JuliaLang/julia@ec33194, and that might be a heavy lift to fix. We'll check in with Keno tomorrow to see how bad it's going to be to fix that (he says it's a miscompile, which sounds bad). If it's too painful we can either disable those reverse-mode tests (since we're not using them anyway), look into some kind of workaround, or even merge this with the reverse mode tests failing.

Originally posted by @staticfloat in #168 (comment)

3 statement switches

This is a minor annoyance for maintenance: when you realize that the IR has encountered something unexpected, or is doing something wrong, you need to look in 3 places.

These are each basically big switch statements that decide how to AD particular IR statements.

It seems like we should only have 2.
We can perhaps collapse forward as a special case of Forward-Demand where you demand everything, or some such.

But they are not trivial to collapse.

`gradient(dropdims)` fails

julia> gradient(x ->dropdims(x; dims=1)[1], rand(1,3))
ERROR: TypeError: in typeassert, expected Int64, got a value of type Nothing
Stacktrace:
  [1] cfg_delete_edge!(cfg::Core.Compiler.CFG, from::Int64, to::Int64)
    @ Core.Compiler ./compiler/ssair/ir.jl:27
  [2] split_critical_edges!(ir::Core.Compiler.IRCode)
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/recurse.jl:213
  [3] transform!(ci::Core.CodeInfo, meth::Method, nargs::Int64, sparams::Core.SimpleVector, N::Int64)
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/recurse.jl:304
  [4] perform_optic_transform(ff::Type{Diffractor.∂⃖recurse{1}}, args::Any)
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:25
  [5] (::Core.GeneratedFunctionStub)(::Any, ::Vararg{Any})
    @ Core ./boot.jl:574
  [6] ∂⃖
    @ ~/.julia/dev/Diffractor/src/stage1/generated.jl:215 [inlined]
  [7] _dropdims
    @ ./abstractarraymath.jl:94 [inlined]
  [8] ∂⃖
    @ ~/.julia/dev/Diffractor/src/stage1/generated.jl:215 [inlined]
  [9] #dropdims#231
    @ ./abstractarraymath.jl:82 [inlined]
 [10] ∂⃖ (repeats 2 times)
    @ ~/.julia/dev/Diffractor/src/stage1/generated.jl:215 [inlined]
 [11] ∂⃖
    @ ~/.julia/dev/Diffractor/src/stage1/generated.jl:377 [inlined]
 [12] KwFunc
    @ ~/.julia/dev/Diffractor/src/stage1/generated.jl:245 [inlined]
 [13] ∂⃖
    @ ~/.julia/dev/Diffractor/src/stage1/generated.jl:215 [inlined]
 [14] #672
    @ ./REPL[290]:1 [inlined]
 [15] (::Diffractor.∂⃖recurse{1})(::var"#672#673", ::Matrix{Float64})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:0
 [16] (::∂⃖{1})(f::var"#672#673", args::Matrix{Float64})
    @ Diffractor ~/.julia/dev/Diffractor/src/stage1/generated.jl:215
 [17] ∂⃖
    @ ~/.julia/dev/Diffractor/src/interface.jl:25 [inlined]
 [18] (::Diffractor.∇{var"#672#673"})(args::Matrix{Float64})
    @ Diffractor ~/.julia/dev/Diffractor/src/interface.jl:121
 [19] Diffractor.∇(::Function, ::Matrix{Float64})
    @ Diffractor ~/.julia/dev/Diffractor/src/interface.jl:128

We could write a rule for this, but perhaps the failure is revealing? It persists even with

ChainRulesCore.@non_differentiable Base._sub(xs...)  # ok
ChainRulesCore.@non_differentiable Base._foldoneto(xs...)  # not really ok

backing error when nesting forward-mode

Here is a trivial example of nested forward mode:

using Diffractor
f(x) = 3x^2
g(x) = Diffractor.∂☆{1}()(Diffractor.ZeroBundle{1}(f), Diffractor.TaylorBundle{1}(x, (1.0,)))
Diffractor.∂☆{1}()(Diffractor.ZeroBundle{1}(g), Diffractor.TaylorBundle{1}(10, (1.0,)))

Here is the output from running that:

julia> Diffractor.∂☆{1}()(Diffractor.ZeroBundle{1}(g), Diffractor.TaylorBundle{1}(10, (1.0,)))
ERROR: ArgumentError: Tangent for the primal Diffractor.UniformTangent{ChainRulesCore.ZeroTangent} should be backed by a NamedTuple type, not by Tuple{ChainRulesCore.ZeroTangent}.
Stacktrace:
  [1] _backing_error(P::Type, G::Type, E::Type)
    @ ChainRulesCore ~/.julia/packages/ChainRulesCore/a4mIA/src/tangent_types/tangent.jl:62
  [2] ChainRulesCore.Tangent{Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}, Tuple{ChainRulesCore.ZeroTangent}}(backing::Tuple{ChainRulesCore.ZeroTangent})
    @ ChainRulesCore ~/.julia/packages/ChainRulesCore/a4mIA/src/tangent_types/tangent.jl:36
  [3] (ChainRulesCore.Tangent{Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}})(args::ChainRulesCore.ZeroTangent)
    @ ChainRulesCore ~/.julia/packages/ChainRulesCore/a4mIA/src/tangent_types/tangent.jl:48
  [4] partial(x::Diffractor.CompositeBundle{1, Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}, Tuple{Diffractor.TangentBundle{1, ChainRulesCore.ZeroTangent, Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}}}}, i::Int64)
    @ Diffractor ~/.julia/packages/Diffractor/HBYjZ/src/stage1/forward.jl:7
  [5] first_partial(x::Diffractor.CompositeBundle{1, Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}, Tuple{Diffractor.TangentBundle{1, ChainRulesCore.ZeroTangent, Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}}}})
    @ Diffractor ~/.julia/packages/Diffractor/HBYjZ/src/stage1/forward.jl:11
  [6] map
    @ ./tuple.jl:291 [inlined]
  [7] map(f::typeof(Diffractor.first_partial), t::Tuple{Diffractor.TangentBundle{1, typeof(Diffractor._TangentBundle), Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}}, Diffractor.TangentBundle{1, Val{1}, Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}}, Diffractor.TangentBundle{1, typeof(f), Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}}, Diffractor.CompositeBundle{1, Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}, Tuple{Diffractor.TangentBundle{1, ChainRulesCore.ZeroTangent, Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}}}}})
    @ Base ./tuple.jl:292
  [8] (::Diffractor.∂☆internal{1})(::Diffractor.TangentBundle{1, typeof(Diffractor._TangentBundle), Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}}, ::Vararg{Diffractor.AbstractTangentBundle{1}})
    @ Diffractor ~/.julia/packages/Diffractor/HBYjZ/src/stage1/forward.jl:110
  [9] (::Diffractor.∂☆{1})(::Diffractor.TangentBundle{1, typeof(Diffractor._TangentBundle), Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}}, ::Vararg{Diffractor.AbstractTangentBundle{1}})
    @ Diffractor ~/.julia/packages/Diffractor/HBYjZ/src/stage1/forward.jl:139
 [10] TangentBundle
    @ ~/.julia/packages/Diffractor/HBYjZ/src/tangent.jl:251 [inlined]
 [11] (::Diffractor.∂☆recurse{1})(::Diffractor.TangentBundle{1, Type{Diffractor.TangentBundle{1, B, Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}} where B}, Diffractor.UniformTangent{ChainRulesCore.NoTangent}}, ::Diffractor.TangentBundle{1, typeof(f), Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}})
    @ Diffractor ~/.julia/packages/Diffractor/HBYjZ/src/stage1/recurse_fwd.jl:0
 [12] (::Diffractor.∂☆internal{1})(::Diffractor.TangentBundle{1, Type{Diffractor.TangentBundle{1, B, Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}} where B}, Diffractor.UniformTangent{ChainRulesCore.NoTangent}}, ::Vararg{Diffractor.AbstractTangentBundle{1}})
    @ Diffractor ~/.julia/packages/Diffractor/HBYjZ/src/stage1/forward.jl:112
 [13] (::Diffractor.∂☆{1})(::Diffractor.TangentBundle{1, Type{Diffractor.TangentBundle{1, B, Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}} where B}, Diffractor.UniformTangent{ChainRulesCore.NoTangent}}, ::Vararg{Diffractor.AbstractTangentBundle{1}})
    @ Diffractor ~/.julia/packages/Diffractor/HBYjZ/src/stage1/forward.jl:139
 [14] g
    @ ~/JuliaEnvs/DAECompiler.jl/scratch/jac_scratch.jl:54 [inlined]
 [15] (::Diffractor.∂☆recurse{1})(::Diffractor.TangentBundle{1, typeof(g), Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}}, ::Diffractor.TangentBundle{1, Int64, Diffractor.TaylorTangent{Tuple{Float64}}})
    @ Diffractor ~/.julia/packages/Diffractor/HBYjZ/src/stage1/recurse_fwd.jl:0
 [16] (::Diffractor.∂☆internal{1})(::Diffractor.TangentBundle{1, typeof(g), Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}}, ::Vararg{Diffractor.AbstractTangentBundle{1}})
    @ Diffractor ~/.julia/packages/Diffractor/HBYjZ/src/stage1/forward.jl:112
 [17] (::Diffractor.∂☆{1})(::Diffractor.TangentBundle{1, typeof(g), Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}}, ::Vararg{Diffractor.AbstractTangentBundle{1}})
    @ Diffractor ~/.julia/packages/Diffractor/HBYjZ/src/stage1/forward.jl:139
 [18] top-level scope
    @ ~/JuliaEnvs/DAECompiler.jl/scratch/jac_scratch.jl:55

I am not sure how we should handle this.
I suspect it is possible to rewrite some (maybe all?) cases to use ∂☆{2},
but I would need to do some thinking.

We definitely shouldn't just be erroring, though.
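
A hedged sketch of the non-nested alternative alluded to above, reusing the bundle constructors from the example at the top of this issue; ∂☆{2}, ZeroBundle{2}, and TaylorBundle{2} are assumed to follow the same pattern as their first-order counterparts, with the coefficient tuple holding (first, second) Taylor coefficients:

using Diffractor
f(x) = 3x^2
# One second-order transform instead of two nested first-order ones.
Diffractor.∂☆{2}()(Diffractor.ZeroBundle{2}(f), Diffractor.TaylorBundle{2}(10.0, (1.0, 0.0)))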

Failure with ComposedFunction `∘`

I'm pretty confident this worked in September, but have no idea whether changes here or in ChainRules broke it: Edit -- the change is JuliaDiff/ChainRulesCore.jl#495, discussed there.

julia> Diffractor.gradient(cbrt, 1.23)
(0.29036348772107673,)

julia> Diffractor.gradient(identity ∘ cbrt, 1.23)
ERROR: ArgumentError: Tangent for the primal Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}} should be backed by a AbstractDict type, not by NamedTuple{(:data,), Tuple{ChainRulesCore.ZeroTangent}}.
Stacktrace:
  [1] _backing_error(P::Type, G::Type, E::Type)
    @ ChainRulesCore ~/.julia/packages/ChainRulesCore/qzYOG/src/tangent_types/tangent.jl:62
  [2] ChainRulesCore.Tangent{Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}, NamedTuple{(:data,), Tuple{ChainRulesCore.ZeroTangent}}}(backing::NamedTuple{(:data,), Tuple{ChainRulesCore.ZeroTangent}})
    @ ChainRulesCore ~/.julia/packages/ChainRulesCore/qzYOG/src/tangent_types/tangent.jl:33
  [3] (::Diffractor.var"#162#164"{Symbol, DataType})(Δ::ChainRulesCore.ZeroTangent)
    @ Diffractor ~/.julia/packages/Diffractor/HYuxt/src/stage1/generated.jl:308
  [4] (::Diffractor.EvenOddOdd{1, 1, Diffractor.var"#162#164"{Symbol, DataType}, Diffractor.var"#163#165"{Symbol}})(Δ::ChainRulesCore.ZeroTangent)
    @ Diffractor ~/.julia/packages/Diffractor/HYuxt/src/stage1/generated.jl:288
  [5] ∂⃖¹₁merge
    @ ./none:1
  [6] ∂⃖¹₁
    @ ./none:1
  [7] (::Diffractor.ApplyOdd{1, 1})(Δ::Float64)
    @ Diffractor ~/.julia/packages/Diffractor/HYuxt/src/stage1/generated.jl:371
  [8] ∂⃖¹₁ComposedFunction
    @ ./none:1
  [9] (::Diffractor.∇{ComposedFunction{typeof(identity), typeof(cbrt)}})(args::Float64)
    @ Diffractor ~/.julia/packages/Diffractor/HYuxt/src/interface.jl:122
 [10] Diffractor.gradient(::Function, ::Float64)
    @ Diffractor ~/.julia/packages/Diffractor/HYuxt/src/interface.jl:128
 [11] top-level scope
    @ REPL[3]:1
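
If you just need to work around this until the kwargs Tangent issue is fixed, wrapping the composition in a plain closure avoids constructing a ComposedFunction at all, so the failing pullback path is never taken. A hedged sketch (untested against this exact version; the result should match the plain cbrt gradient above):

julia> Diffractor.gradient(x -> identity(cbrt(x)), 1.23)  # expected: (0.29036348772107673,)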

Forward mode with complex numbers

These should agree, right?

julia> Diffractor.PrimeDerivativeBack(x -> real(sin(x)))(1+im)
0.8337300251311491 + 0.9888977057628651im

julia> Diffractor.PrimeDerivativeFwd(x -> real(sin(x)))(1+im)
ERROR: Tangent space not defined for `Complex{Int64}.
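
The error singles out Complex{Int64}, so one hedged guess is that the integer-typed complex primal is the immediate blocker, separate from the real/imaginary-part question. A cheap first check (untested sketch) is to feed a floating-point complex number instead:

julia> Diffractor.PrimeDerivativeFwd(x -> real(sin(x)))(1.0 + 1.0im)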

Better show method for zero bundles

Diffractor.TangentBundle{1, typeof(getfield), Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}}(getfield, Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}(ChainRulesCore.ZeroTangent()))

is uselessly verbose
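
Something along these lines could print such bundles compactly. This is only a sketch: the type parameters follow the verbose output above, but the primal field name and the ZeroBundle spelling are assumptions about Diffractor's internals:

using Diffractor, ChainRulesCore

# Hypothetical compact show method; assumes TangentBundle stores its primal
# in a field named `primal`, as the constructor call above suggests.
function Base.show(io::IO, b::Diffractor.TangentBundle{N, B, Diffractor.UniformTangent{ChainRulesCore.ZeroTangent}}) where {N, B}
    print(io, "ZeroBundle{", N, "}(", b.primal, ")")
end

With that in place, the example above would print as ZeroBundle{1}(getfield).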

custom AD transform plugin: a use case from YaoCompiler

This is just one thing I was waiting on Diffractor to think about. It comes from the idea of integrating Diffractor and YaoCompiler, so I'll leave a description of what we want to do here. I assume this could be similar for GPUs too, since this design of the SSA IR can be seen as MLIR regions.

In a quantum program, there are certain AD rules that make use of reversibility and (quantum-specific) algebra rules, such as those for Pauli operators.
As a result, there are three ways of doing AD on a quantum program:

  1. parameter shift rule: forward-mode AD; works on hardware, but only on pure quantum circuits that satisfy the Pauli operator permutation rule
  2. reverse mode using circuit reversibility: reverse mode, but only runs on a simulator; the register (quantum state vector) is always mutated to save memory (and thus has side effects)
  3. transform the circuit into an integral of channels: forward mode; works on hardware and on a broader set of operators, but creates a hybrid program (which may contain classical control flow)

These can be implemented on a pure quantum program without touching compiler transforms, which we can do in the short term. But the problem becomes different when the program contains classical control flow (the long-term goal), which is one feature we aim to support in YaoCompiler - compiling a general Julia program to quantum hardware. In YaoCompiler, we extend the Julia SSA IR with two new elements:

  1. a quantum block: like a basic block, but containing only quantum instructions/statements; such a block always lives inside a basic block
  2. a quantum terminator: a statement that jumps from a quantum block to a basic block

(I wish I could post a picture of this, but I don't have one at the moment)

So the idea is then to tell an AD compiler like Diffractor how to transform a piece of program like this using a plugin, or to re-use Diffractor's transformation from another compiler plugin for the classical part (for the quantum case: run Diffractor on the classical statements, but run a custom pass on the quantum components above) - but I'm not sure this is the best way of doing it. And I think with the work on Julia compiler plugins there will be more cases like this, where someone may want a custom AD plugin to be used by Diffractor when differentiating under their compiler-plugin context.

I guess some other folks might be interested in this, cc: @aviatesk @femtomc

Teach Forward Mode about Effects and Zeros

In forward mode, if you can prove code is effect-free, then it is safe not to execute it when you are only pushing forward zeros and you don't need the primal result (see the sketch below).
There might be some other things we can do with effects here too.

Probably worth cross-referencing what is happening in Dex-land with effects.
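
As a rough illustration of the idea (not Diffractor's actual pass: pushforward_or_skip is a name made up here, and Base.infer_effects / Core.Compiler.is_effect_free are internal Julia APIs):

using ChainRulesCore

function pushforward_or_skip(f, x, ẋ)
    effects = Base.infer_effects(f, (typeof(x),))
    if ẋ isa ChainRulesCore.AbstractZero && Core.Compiler.is_effect_free(effects)
        # Proven effect-free and fed a hard zero: skip running f entirely.
        return ZeroTangent()
    end
    # Otherwise fall back to the ordinary pushforward (assumes an frule exists).
    y, ẏ = frule((NoTangent(), ẋ), f, x)
    return ẏ
end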

Diffractor's forward mode cannot differentiate a struct with extra type params

MWE:

julia> struct Goo{I,T}
           x::T
       end

julia> using Diffractor

julia> const var"'" = Diffractor.PrimeDerivativeFwd
Diffractor.PrimeDerivativeFwd

julia> goo(x) = Goo{Int,typeof(x)}(x).x
goo (generic function with 1 method)

julia> goo'(1)
ERROR: MethodError: no method matching apply_type(::Type{Goo}, ::Type{Int64}, ::Type{Int64})
Stacktrace:
  [1] ∂☆nomethd(args::Any)
    @ Diffractor ~/src/julia/Diffractor/src/stage1/recurse_fwd.jl:115
  [2] macro expansion
    @ ~/src/julia/Diffractor/src/stage1/recurse_fwd.jl:0 [inlined]
  [3] (::Diffractor.∂☆recurse{1})(::Diffractor.UniformBu, ::Diffractor.UniformBu, ::Diffractor.UniformBu, ::Diffractor.UniformBu)
    @ Diffractor ~/src/julia/Diffractor/src/stage1/recurse_fwd.jl:160
  [4] (::Diffractor.∂☆internal{1})(::Diffractor.UniformBu, ::Vararg{Diffractor.Ab)
    @ Diffractor ~/src/julia/Diffractor/src/stage1/forward.jl:115
  [5] (::Diffractor.∂☆{1})(::Diffractor.UniformBu, ::Vararg{Diffractor.Ab)
    @ Diffractor ~/src/julia/Diffractor/src/stage1/forward.jl:134
  [6] ∂☆recurse
    @ ./REPL[4]:1 [inlined]
  [7] (::Diffractor.∂☆recurse{1})(::Diffractor.UniformBu, ::Diffractor.TangentBu)
    @ Diffractor ~/src/julia/Diffractor/src/stage1/recurse_fwd.jl:0
  [8] (::Diffractor.∂☆internal{1})(::Diffractor.UniformBu, ::Vararg{Diffractor.Ab)
    @ Diffractor ~/src/julia/Diffractor/src/stage1/forward.jl:115
  [9] ∂☆
    @ ~/src/julia/Diffractor/src/stage1/forward.jl:134 [inlined]
 [10] (::Diffractor.PrimeDerivativeFwd{1, typeof(goo)})(x::Int64)
    @ Diffractor ~/src/julia/Diffractor/src/interface.jl:177
 [11] top-level scope
    @ REPL[5]:1
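
The failing call is Core.apply_type, which shows up because Goo{Int,typeof(x)} is computed at runtime inside the differentiated body. A hedged workaround (untested) is to hoist the fully parameterized type into a constant, so the lowered code of the differentiated function never calls apply_type:

julia> const GooII = Goo{Int, Int}   # hypothetical workaround, name made up
Goo{Int64, Int64}

julia> goo2(x) = GooII(x).x;

julia> goo2'(1)   # may sidestep the dynamic apply_type path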

Strange behaviour of the derivative

I do not know, maybe I am missing something simple here

using Diffractor
using Plots

function f1(x)
    res = 0.0
    for i in 1:10
        res += x^i
    end
    res
end

f(x) = let var"'" = Diffractor.PrimeDerivativeFwd 
    f1'(x)
end


xs = -0.5:0.01:0.5
plot(xs, f1.(xs), legend = nothing)
plot!(xs, f.(xs))

It has this weird glitch at zero.

[Screenshot: plot of f1 and its forward-mode derivative over xs, showing a spike at x = 0]
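
For reference, plotting the closed-form derivative alongside makes it easy to confirm that only the point x = 0 is off (sketch; f1exact is a name introduced here):

f1exact(x) = sum(i * x^(i - 1) for i in 1:10)   # analytic derivative of f1
plot!(xs, f1exact.(xs))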

Info:

julia> versioninfo()
Julia Version 1.8.0-DEV.74
Commit ba5fffca7b (2021-06-24 02:52 UTC)
Platform Info:
  OS: Linux (x86_64-linux-gnu)
  CPU: Intel(R) Core(TM) i7-7700HQ CPU @ 2.80GHz
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-12.0.0 (ORCJIT, skylake)
Environment:
  JULIA_NUM_THREADS = 4
  JULIA_PKG_SERVER =

(Diffractor) pkg> st
      Status `~/Projects/BabySteps/Diffractor/Project.toml`
  [d360d2e6] ChainRulesCore v1.0.2
  [9f5e2b26] Diffractor v0.1.1 `~/.julia/dev/Diffractor`
  [91a5bcdd] Plots v1.19.4

Very poor performance on simple taylor function

In Chris' recent SciML video, at this timestamp he showcases a little AD benchmark. Summarizing, differentiating the function:

function taylor(x, N)
    sum = 0 * x
    for i = 1:N
        sum += x^i / i
    end
    return sum
end

Shows pretty poor performance in Diffractor. I've created a gist that contains a Project, Manifest and test script to showcase the issue. You can increase N to get harder and harder problems. Note that I was unable to get Enzyme working on master (perhaps I need to check out the master branch) and Diffractor dies with a stack overflow if N is too large.

For N=10^4, we get timings of Enzyme's code running in <1ms, and Diffractor taking more than 300ms. Zygote and ForwardDiff are both well under 10ms.
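
A minimal harness along these lines reproduces the comparison (sketch only: the gist's actual script, package versions, and evaluation point may differ, and the gist does not say which Diffractor mode it used; forward mode is shown here):

using Diffractor, BenchmarkTools

N = 10^4
d_taylor = Diffractor.PrimeDerivativeFwd(x -> taylor(x, N))
@btime $d_taylor(0.5)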

Reverse mode sometimes segfaults Julia

MWE:

julia> using Diffractor

julia> foo(x) = cos(x) * sin(x)
foo (generic function with 1 method)

julia> Diffractor.PrimeDerivativeBack(Diffractor.PrimeDerivativeBack(Diffractor.PrimeDerivativeBack(foo)))(1.0)
Internal error: encountered unexpected error in runtime:
BoundsError(a=Array{Core.SSAValue, (6,)}[SSAValue(1), SSAValue(2), SSAValue(3), SSAValue(4), SSAValue(18446744073709551615), SSAValue(18446744073709551615)], i=(-9223372036854775786,))
jl_bounds_error_ints at /Users/scheme/.bin/julia/src/rtutils.c:194
getindex at ./array.jl:835 [inlined]
renumber_ssa at ./compiler/ssair/slot2ssa.jl:65
renumber_ssa! at ./compiler/ssair/slot2ssa.jl:71 [inlined]
renumber_ssa! at ./compiler/ssair/slot2ssa.jl:71
_jl_invoke at /Users/scheme/.bin/julia/src/gf.c:0 [inlined]
jl_apply_generic at /Users/scheme/.bin/julia/src/gf.c:2427

Error with higher-order derivatives with Flux.Chain and DiffEqFlux.FastChain

using Flux, DiffEqFlux
using Diffractor: var"'", ∂⃖

chain = FastChain(FastDense(1,12,Flux.σ),FastDense(12,1),(x,p) ->x[1])
initθ = (DiffEqFlux.initial_params(chain))
chain_f = (x) -> chain([x], initθ)
chain_f(1.)
chain_f'(1.)
chain_f''(1.)
Internal error: encountered unexpected error in runtime:
BoundsError(a=Array{Any, (0,)}[], i=(1,))

chain = Flux.Chain(Dense(1,12,Flux.σ),Dense(12,1),(x) ->x[1])
chain([1.])
chain_f = (x) -> chain([x])
chain_f(1.)
chain_f'(1.)

chain_f''(1.)
ERROR: MethodError: no method matching +(::Tuple{ChainRulesCore.Tangent{ChainRules.var"#1201#1203"{Vector{Float64}, Tuple{ChainRulesCore.ProjectTo{Float64, NamedTuple{(), Tuple{}}}}}, NamedTuple{(, :projects), Tuple{Vector{Float64}, Tuple{ChainRulesCore.ZeroTangent}}}}, ChainRulesCore.NoTangent})
Closest candidates are:
  +(::P, ::ChainRulesCore.Tangent{P}) where P at ~/.julia/packages/ChainRulesCore/7ZiwT/src/tangent_arithmetic.jl:133
  +(::Any, ::ChainRulesCore.AbstractThunk) at ~/.julia/packages/ChainRulesCore/7ZiwT/src/tangent_arithmetic.jl:123
  +(::Any, ::Union{InitialValues.NonspecificInitialValue, InitialValues.SpecificInitialValue{typeof(+)}}) at ~/.julia/packages/InitialValues/P5PLf/src/InitialValues.jl:160

julia> versioninfo()
Julia Version 1.7.0-rc3
Commit 3348de4ea6 (2021-11-15 08:22 UTC)
Platform Info:
  OS: macOS (x86_64-apple-darwin19.6.0)
  CPU: Intel(R) Core(TM) i5-1038NG7 CPU @ 2.00GHz
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-12.0.1 (ORCJIT, icelake-client)
Environment:
  JULIA_NUM_THREADS = 4

Broken on nightly due to type change

From the tests

julia> using TestEnv; TestEnv.activate(); using Symbolics, Diffractor

julia> @variables ω α β γ δ ϵ ζ η
       (x1, c1) = ∂⃖{3}()(exp, ω)
ERROR: TypeError: in opaque_closure_method, expected Int64, got a value of type Int32

Some of the types have been truncated in the stacktrace for improved reading. To emit complete information
in the stack trace, evaluate `TruncatedStacktraces.VERBOSE[] = true` and re-run the code.

Stacktrace:
 [1] ∂⃖
   @ ~/JuliaEnvs/Diffractor.jl/src/stage1/generated.jl:223 [inlined]
 [2] (::∂⃖{3})(f::typeof(exp), args::Num)
   @ Diffractor ~/JuliaEnvs/Diffractor.jl/src/stage1/generated.jl:66
 [3] top-level scope
   @ REPL[38]:2

Nightly first failed with this on
22 Feb https://github.com/JuliaDiff/Diffractor.jl/actions/runs/4229848232
and last worked on
21 Feb https://github.com/JuliaDiff/Diffractor.jl/actions/runs/4229848232

So there is a narrow window of changes that could have caused this failure.

Mark nightly jobs as allowed to fail

Julia nightly is a rapidly moving target; we should mark nightly jobs as allowed to fail so that we don't have CI that is constantly failing.

Accumulation

At present Diffractor may return thunks, but doesn't seem to use them (or anything else) to accumulate efficiently:

julia> @btime gradient(x -> sum(x), $(rand(100, 100))) |> first |> typeof
  min 1.829 μs, mean 1.869 μs (8 allocations, 368 bytes)
ChainRulesCore.InplaceableThunk{ChainRulesCore.Thunk...

julia> @btime gradient(x -> sum(x) + sum(x), $(rand(100, 100))) |> first |> typeof
  min 8.917 μs, mean 28.032 μs (22 allocations, 235.03 KiB)
Matrix{Float64} (alias for Array{Float64, 2})

julia> @btime copy($(rand(100, 100)));
  min 1.262 μs, mean 4.120 μs (2 allocations, 78.17 KiB)

julia> 235.03 / 78.17
3.00665216835103

Should this change, perhaps to use ChainRulesCore.add!!? If so, it might be easiest to change now, while there is nothing downstream to break.
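
For concreteness, this is roughly what accumulation through add!! looks like at the ChainRulesCore level: an InplaceableThunk lets the second contribution be added into the first buffer instead of materializing both (sketch; variable names made up):

using ChainRulesCore

x1 = rand(100, 100); x2 = rand(100, 100)
t2 = InplaceableThunk(acc -> acc .+= x2, @thunk(copy(x2)))
acc = ChainRulesCore.add!!(x1, t2)   # adds into x1's buffer rather than copying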

As an aside, note that add!! is slower than expected here: 3 copies' worth of allocations, not 1:

julia> g1 = gradient(x -> sum(x), (rand(100, 100)))[1]; g2 = deepcopy(g1);

julia> @btime ChainRulesCore.add!!($(g1), $(g2));
  min 4.719 μs, mean 21.300 μs (8 allocations, 234.55 KiB)

julia> 234.55 / 78.17
3.0005117052577717

julia> @btime ChainRulesCore.add!!($(randn(100, 100)), $(g2));
  min 3.318 μs, mean 3.438 μs (0 allocations)

Or do we count on ImmutableArray and compiler improvements?

Xref FluxML/Zygote.jl#981 which retrofits Zygote to accumulate in-place.

Xref also JuliaDiff/ChainRulesCore.jl#539 which alters + on two thunks to be more efficient.

No method matching adjoint(::typeof(f))

The first thing I try with Diffractor blows up.

Is there something I'm missing about how to use it?

julia> using Diffractor

julia> f(x) = 4*x + 5
f (generic function with 1 method)

julia> f(10)
45

julia> f'(10)
ERROR: MethodError: no method matching adjoint(::typeof(f))
Closest candidates are:
  adjoint(::Union{LinearAlgebra.QR, LinearAlgebra.QRCompactWY, LinearAlgebra.QRPivoted}) at C:\dd\Julia 1.7.2\share\julia\stdlib\v1.7\LinearAlgebra\src\qr.jl:509
  adjoint(::Union{LinearAlgebra.Cholesky, LinearAlgebra.CholeskyPivoted}) at C:\dd\Julia 1.7.2\share\julia\stdlib\v1.7\LinearAlgebra\src\cholesky.jl:538
  adjoint(::ChainRulesCore.AbstractZero) at C:\Users\niklasg\.julia\packages\ChainRulesCore\IzITE\src\tangent_types\abstract_zero.jl:23
  ...
Stacktrace:
 [1] top-level scope
   @ REPL[4]:1

It works fine with Zygote:

julia> using Zygote

julia> f(x) = 4*x + 5
f (generic function with 1 method)

julia> f(10)
45

julia> f'(10)
4.0
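
The postfix ' in Julia lowers to a call to the function named var"'", which Base binds to adjoint; Diffractor defines its own var"'" but does not pirate Base.adjoint, so it has to be brought into scope explicitly, as other examples in this collection do. With that import the example should give the expected 4.0:

julia> using Diffractor: var"'"

julia> f'(10)
4.0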

Cannot compute nested gradients with reverse mode AD

The following fails in Diffractor (works in Zygote but only in mixed mode):

julia> using Diffractor: var"'"

julia> A = randn(3,3);

julia> f(x) = sum(A*x);

julia> g(x) = sum(x .* f'(x));

julia> g'(ones(3))
ERROR: MethodError: no method matching _adjoint_vec_pullback(::ChainRulesCore.ZeroTangent)
Closest candidates are:
  _adjoint_vec_pullback(::ChainRulesCore.Tangent) at /home/bj0rn/.julia/packages/ChainRules/xaVoS/src/rulesets/LinearAlgebra/structured.jl:119
  _adjoint_vec_pullback(::AbstractMatrix) at /home/bj0rn/.julia/packages/ChainRules/xaVoS/src/rulesets/LinearAlgebra/structured.jl:120
  _adjoint_vec_pullback(::ChainRulesCore.AbstractThunk) at /home/bj0rn/.julia/packages/ChainRules/xaVoS/src/rulesets/LinearAlgebra/structured.jl:121
Stacktrace:
 [1] ∂⃖¹₁times_pullback
   @ ./none:1
 [2] (::Diffractor.∂⃖rruleB{1, 1})(::ChainRulesCore.ZeroTangent, ::Vararg{Any})
   @ Diffractor ~/.julia/packages/Diffractor/EwwgG/src/stage1/generated.jl:64
 [3] ∂⃖²₂f
   @ ./none:1
 [4] (::Diffractor.∂⃖weaveInnerOdd{1, 1})(Δ::Tuple{ChainRulesCore.ZeroTangent, Vector{Float64}})
   @ Diffractor ~/.julia/packages/Diffractor/EwwgG/src/stage1/generated.jl:64
 [5] ∂⃖¹₁PrimeDerivativeBack
   @ ./none:1
 [6] ∂⃖¹₁g
   @ ./none:1
 [7] (::Diffractor.PrimeDerivativeBack{1, typeof(g)})(x::Vector{Float64})
   @ Diffractor ~/.julia/packages/Diffractor/EwwgG/src/interface.jl:160
 [8] top-level scope
   @ REPL[5]:1

Support mutation for forward mode

We currently have (I think) partial support for mutation in forward mode:
I believe mutating arrays works, but mutating structs does not.

We need two things for this:

  • JuliaDiff/ChainRulesCore.jl#105 a MutableTangent type
  • To never automatically insert ZeroTangent for mutable structs, and instead use a MutableTangent with no fields / all fields set to ZeroTangent (unless those fields are themselves mutable structs); see the sketch below

Mutation in forward mode is much easier than in reverse.
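
To make the second bullet concrete, here is the kind of code that breaks if the tangent of a mutable struct is eagerly frozen to ZeroTangent (illustrative only; Counter and bump! are made up):

mutable struct Counter
    x::Float64
end

function bump!(c::Counter, v)
    c.x += v    # this setfield! must also update the tangent of c.x;
    return c.x  # a frozen ZeroTangent has no slot to record the new partial
end

A MutableTangent whose fields start at ZeroTangent but can be overwritten gives setfield! somewhere to write that partial.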
