Dataset schema:
  repo_name             string (length 1–62)
  dataset               string (1 class)
  lang                  string (11 values)
  pr_id                 int64 (1–20.1k)
  owner                 string (length 2–34)
  reviewer              string (length 2–39)
  diff_hunk             string (length 15–262k)
  code_review_comment   string (length 1–99.6k)
Mooncake.jl
github_2023
others
525
compintell
willtebbutt
@@ -169,8 +205,86 @@ struct Cache{Trule,Ty_cache,Ttangents<:Tuple} tangents::Ttangents end -_copy!!(dst, src) = copy!(dst, src) -_copy!!(::Number, src::Number) = src +const supportedcollections = Union{Tuple,NamedTuple} + +""" + is_user_defined_struct(T) + +Required for checking if datatype `T` is a immutabl...
Would it be easier just to add a method of `__exclude_unsupported_output_internal!` that explicitly handles `Tuple`s and `NamedTuple`s? i.e. something like ```julia function __exclude_unsupported_output_internal!(y::Union{Tuple,NamedTuple}, address_set::Set{UInt}) map(Base.Fix2(__exclude_unsupported_output_inter...
Mooncake.jl
github_2023
others
525
compintell
willtebbutt
@@ -169,8 +205,86 @@ struct Cache{Trule,Ty_cache,Ttangents<:Tuple} tangents::Ttangents end -_copy!!(dst, src) = copy!(dst, src) -_copy!!(::Number, src::Number) = src +const supportedcollections = Union{Tuple,NamedTuple}
```suggestion ``` See below.
Mooncake.jl
github_2023
others
525
compintell
willtebbutt
@@ -47,6 +47,39 @@ function throw_val_and_grad_ret_type_error(y) ) end +function throw_forward_ret_type_error(y) + throw( + ValueAndGradientReturnTypeError( + "Found a value of type $(typeof(y)) in output, but output is not permitted to be or contain a pointer. This is because the amount of...
```suggestion ``` I think we can get rid of this. See below.
Mooncake.jl
github_2023
others
525
compintell
willtebbutt
@@ -47,6 +47,39 @@ function throw_val_and_grad_ret_type_error(y) ) end +function throw_forward_ret_type_error(y) + throw( + ValueAndGradientReturnTypeError( + "Found a value of type $(typeof(y)) in output, but output is not permitted to be or contain a pointer. This is because the amount of...
```suggestion ``` I think we can also get rid of this. See below.
Mooncake.jl
github_2023
others
525
compintell
willtebbutt
@@ -169,8 +205,86 @@ struct Cache{Trule,Ty_cache,Ttangents<:Tuple} tangents::Ttangents end -_copy!!(dst, src) = copy!(dst, src) -_copy!!(::Number, src::Number) = src +const supportedcollections = Union{Tuple,NamedTuple} + +""" + is_user_defined_struct(T) + +Required for checking if datatype `T` is a immutabl...
Is there a particular problem that arises if the output contains unassigned elements?
Mooncake.jl
github_2023
others
525
compintell
willtebbutt
@@ -169,8 +205,86 @@ struct Cache{Trule,Ty_cache,Ttangents<:Tuple} tangents::Ttangents end -_copy!!(dst, src) = copy!(dst, src) -_copy!!(::Number, src::Number) = src +const supportedcollections = Union{Tuple,NamedTuple} + +""" + is_user_defined_struct(T) + +Required for checking if datatype `T` is a immutabl...
Ah, also, we only need to add items to the `address_set` if their identity is given by their address in memory, so we can restrict this to only run if `ismutabletype(T)` is `true` I think.
Mooncake.jl
github_2023
others
525
compintell
willtebbutt
@@ -98,4 +98,81 @@ end end end end + + @testset "prepare_pullback_cache errors" begin + # Test when function outputs a valid type. + struct userdefinedstruct
```suggestion struct UserDefinedStruct ``` style
Mooncake.jl
github_2023
others
525
compintell
willtebbutt
@@ -98,4 +98,81 @@ end end end end + + @testset "prepare_pullback_cache errors" begin + # Test when function outputs a valid type. + struct userdefinedstruct + a::Int64 + b::Vector{Float64} + c::Vector{Vector{Float64}} + end + + ...
```suggestion mutable struct UserDefinedMutableStruct ``` style
Mooncake.jl
github_2023
others
525
compintell
willtebbutt
@@ -98,4 +98,81 @@ end end end end + + @testset "prepare_pullback_cache errors" begin + # Test when function outputs a valid type. + struct userdefinedstruct + a::Int64 + b::Vector{Float64} + c::Vector{Vector{Float64}} + end + + ...
```suggestion test_to_pass_cases = [ ```
Mooncake.jl
github_2023
others
525
compintell
willtebbutt
@@ -47,6 +47,23 @@ function throw_val_and_grad_ret_type_error(y) ) end +function throw_forward_ret_type_error(y) + throw( + ValueAndGradientReturnTypeError(
```suggestion ValueAndPullbackReturnTypeError( ``` This is super picky, but I wonder whether we could have a separate error type for pullback return-type errors?
Mooncake.jl
github_2023
others
525
compintell
willtebbutt
@@ -169,8 +193,82 @@ struct Cache{Trule,Ty_cache,Ttangents<:Tuple} tangents::Ttangents end -_copy!!(dst, src) = copy!(dst, src) -_copy!!(::Number, src::Number) = src +""" + is_user_defined_struct(T) + +Required for checking if datatype `T` is a immutable Composite, Mutable Composite type (returns true) or a ...
(Hopefully) the last thing I'd like to discuss is this. I'm wondering whether it makes more sense to just add methods of `__exclude_unsupported_output` for `Array` and `Memory` (the latter only on 1.11)? This would mean that we could get rid of this function entirely, and just keep the loop over `fieldnames(T)` in the ...
Mooncake.jl
github_2023
others
525
compintell
willtebbutt
@@ -169,8 +193,82 @@ struct Cache{Trule,Ty_cache,Ttangents<:Tuple} tangents::Ttangents end -_copy!!(dst, src) = copy!(dst, src) -_copy!!(::Number, src::Number) = src +""" + __exclude_unsupported_output(y) + +Required for the robust design of [`value_and_pullback`](@ref), [`prepare_pullback_cache`](@ref). +...
```suggestion const __BuiltinArrays = @static VERSION >= v"1.11" ? Union{Array,Memory} : Array function __exclude_unsupported_output_internal!( y::T, address_set::Set{UInt} ) where {T<:__BuiltinArrays} ``` so sorry, would you mind if we renamed `IterableCollections`? It's just a very general name, and I thi...
Mooncake.jl
github_2023
others
514
compintell
willtebbutt
@@ -10,6 +10,7 @@ ChainRulesCore = "d360d2e6-b24c-11e9-a2a3-2a2ae2dbcce4" DiffRules = "b552c78f-8df3-52c6-915a-8e097449b14b" DiffTests = "de460e47-3fe3-5279-bb4a-814414816d5d" ExprTools = "e2ba6199-217a-4e67-a87a-7c52f15ade04" +Flux = "587475ba-b771-5e3f-ad9e-33799f191a9c"
Could we please make this a package extension, rather than making Flux a dep of Mooncake?
Mooncake.jl
github_2023
others
514
compintell
willtebbutt
@@ -33,3 +33,55 @@ end function generate_derived_rrule!!_test_cases(rng_ctor, ::Val{:linear_algebra}) return Any[], Any[] end + +@is_primitive DefaultCtx Tuple{ + typeof(Losses.mse),AbstractArray{<:IEEEFloat},AbstractArray{<:IEEEFloat} +} + +function rrule!!( + ::CoDual{typeof(Losses.mse)}, + X::CoDual{...
```suggestion function flux_mse_pullback(dloss::P) ``` style
Mooncake.jl
github_2023
others
514
compintell
willtebbutt
@@ -33,3 +33,55 @@ end function generate_derived_rrule!!_test_cases(rng_ctor, ::Val{:linear_algebra}) return Any[], Any[] end + +@is_primitive DefaultCtx Tuple{ + typeof(Losses.mse),AbstractArray{<:IEEEFloat},AbstractArray{<:IEEEFloat} +} + +function rrule!!( + ::CoDual{typeof(Losses.mse)}, + X::CoDual{...
```suggestion return zero_fcodual(Losses.mse(X.x, Y.x)), flux_mse_pullback ```
Mooncake.jl
github_2023
others
514
compintell
willtebbutt
@@ -33,3 +33,55 @@ end function generate_derived_rrule!!_test_cases(rng_ctor, ::Val{:linear_algebra}) return Any[], Any[] end + +@is_primitive DefaultCtx Tuple{ + typeof(Losses.mse),AbstractArray{<:IEEEFloat},AbstractArray{<:IEEEFloat} +} + +function rrule!!( + ::CoDual{typeof(Losses.mse)}, + X::CoDual{...
```suggestion N = P(2) / P(length(X.x)) .* (X.x .- Y.x) ``` Two things here: 1. could we please use a variable name other than `N`? It's typically used for counts and lengths of things 2. per the suggestion, I think we probably want to fuse the broadcasting here to avoid extra allocations
Mooncake.jl
github_2023
others
514
compintell
willtebbutt
@@ -33,3 +33,55 @@ end function generate_derived_rrule!!_test_cases(rng_ctor, ::Val{:linear_algebra}) return Any[], Any[] end + +@is_primitive DefaultCtx Tuple{ + typeof(Losses.mse),AbstractArray{<:IEEEFloat},AbstractArray{<:IEEEFloat} +} + +function rrule!!( + ::CoDual{typeof(Losses.mse)}, + X::CoDual{...
```suggestion X::CoDual{<:Array{P}}, Y::CoDual{<:Array{P}}, ``` the convention in Mooncake is to provide very strict typing for rules. I'm realising now that I don't have the rationale for this written down anywhere -- in short it's because if you write rules for abstract types, they'll usually be subtly wr...
Mooncake.jl
github_2023
others
514
compintell
willtebbutt
@@ -33,3 +33,55 @@ end function generate_derived_rrule!!_test_cases(rng_ctor, ::Val{:linear_algebra}) return Any[], Any[] end + +@is_primitive DefaultCtx Tuple{ + typeof(Losses.mse),AbstractArray{<:IEEEFloat},AbstractArray{<:IEEEFloat} +} + +function rrule!!( + ::CoDual{typeof(Losses.mse)}, + X::CoDual{...
Could you explain why such small numbers are being used here? Rather than e.g. just sampling them via `randn`?
Mooncake.jl
github_2023
others
514
compintell
willtebbutt
@@ -0,0 +1,40 @@ +using Pkg +Pkg.activate(@__DIR__) +Pkg.develop(; path=joinpath(@__DIR__, "..", "..", "..")) + +using Mooncake, StableRNGs, Test +using Flux: Losses +using Mooncake.TestUtils: test_rule + +@testset "flux" begin + @testset "$f, $(typeof(fargs))" for ( + interface_only, perf_flag, is_primitive,...
```suggestion false, :none, true, Losses.mse, randn(StableRNG(1), P, 3), randn(StableRNG(2), P, 3)) ``` Since this is now a primitive, we should tweak the corresponding flag to ensure that the `test_rule` checks to see whether it's hitting the primitive. The same applies to the other test cases in th...
Mooncake.jl
github_2023
others
514
compintell
willtebbutt
@@ -0,0 +1,29 @@ +module MooncakeFluxExt + +using Mooncake, Flux +using Base: IEEEFloat +import Mooncake: + DefaultCtx, rrule!!, @is_primitive, @mooncake_overlay, CoDual, zero_fcodual, NoRData
```suggestion DefaultCtx, rrule!!, @is_primitive, CoDual, zero_fcodual, NoRData ``` Unless I'm mistaken, mooncake_overlays aren't being used here.
Mooncake.jl
github_2023
others
514
compintell
willtebbutt
@@ -0,0 +1,28 @@ +module MooncakeFluxExt + +using Mooncake, Flux +using Base: IEEEFloat +import Mooncake: DefaultCtx, rrule!!, @is_primitive, CoDual, zero_fcodual, NoRData + +@is_primitive DefaultCtx Tuple{ + typeof(Flux.Losses.mse),Array{P},Array{P} +} where {P<:IEEEFloat} + +function rrule!!( + ::CoDual{typeof(...
```suggestion ``` I think this comment can go, since the line below it no longer contains a `NoFData` directly.
Mooncake.jl
github_2023
others
514
compintell
willtebbutt
@@ -0,0 +1,29 @@ +using Pkg +Pkg.activate(@__DIR__) +Pkg.develop(; path=joinpath(@__DIR__, "..", "..", "..")) + +using Mooncake, StableRNGs, Test, Flux +using Mooncake.TestUtils: test_rule + +@testset "flux" begin + @testset "$f, $(typeof(fargs))" for ( + interface_only, perf_flag, is_primitive, f, fargs... +...
```suggestion ``` Can this comment also go now that we're not changing how we choose the input vector depending on the precision?
Mooncake.jl
github_2023
others
514
compintell
willtebbutt
@@ -0,0 +1,29 @@ +using Pkg +Pkg.activate(@__DIR__) +Pkg.develop(; path=joinpath(@__DIR__, "..", "..", "..")) + +using Mooncake, StableRNGs, Test, Flux +using Mooncake.TestUtils: test_rule + +@testset "flux" begin + @testset "$f, $(typeof(fargs))" for ( + interface_only, perf_flag, is_primitive, f, fargs... +...
```suggestion true, ``` I think maybe this still needs to be changed to `true`?
Mooncake.jl
github_2023
others
514
compintell
willtebbutt
@@ -0,0 +1,29 @@ +using Pkg +Pkg.activate(@__DIR__) +Pkg.develop(; path=joinpath(@__DIR__, "..", "..", "..")) + +using Mooncake, StableRNGs, Test, Flux +using Mooncake.TestUtils: test_rule + +@testset "flux" begin + @testset "$f, $(typeof(fargs))" for ( + interface_only, perf_flag, is_primitive, f, fargs... +...
```suggestion :stability, ``` Could we please change this flag so that we always check that the code is type-stable? (it definitely looks like it ought to be, but best to be sure).
Mooncake.jl
github_2023
others
514
compintell
willtebbutt
@@ -1,3 +1,3 @@ @testset "linear_algebra" begin TestUtils.run_rrule!!_test_cases(StableRNG, Val(:linear_algebra)) -end +end
```suggestion end ``` Do you know what has changed here? Would be good to get this out of the diff.
Mooncake.jl
github_2023
others
514
compintell
willtebbutt
@@ -0,0 +1,28 @@ +module MooncakeFluxExt + +using Mooncake, Flux +using Base: IEEEFloat +import Mooncake: DefaultCtx, rrule!!, @is_primitive, CoDual, zero_fcodual, NoRData + +@is_primitive DefaultCtx Tuple{ + typeof(Flux.Losses.mse),Array{P},Array{P} +} where {P<:IEEEFloat} + +function rrule!!( + ::CoDual{typeof(...
```suggestion end ``` I think we maybe need a newline at the end of this file to make github happy.
Mooncake.jl
github_2023
others
514
compintell
willtebbutt
@@ -49,6 +51,7 @@ ChainRulesCore = "1" DiffRules = "1" DiffTests = "0.1" ExprTools = "0.1" +Flux = "0.15.2"
I just dev-ed Flux locally, and it looks like there's a new version, 0.16.3 available. Is there a reason we're pinning ourselves to an old patch version of Flux here, or could we update to the latest version?
Mooncake.jl
github_2023
others
514
compintell
willtebbutt
@@ -0,0 +1,28 @@ +module MooncakeFluxExt + +using Mooncake, Flux +using Base: IEEEFloat +import Mooncake: DefaultCtx, rrule!!, @is_primitive, CoDual, zero_fcodual, NoRData + +@is_primitive DefaultCtx Tuple{ + typeof(Flux.Losses.mse),Array{P},Array{P} +} where {P<:IEEEFloat} + +function rrule!!( + ::CoDual{typeof(...
```suggestion tmp = dloss * P(2 / length(X.x)) @inbounds for n in eachindex(X.x) d = X.x[n] - Y.x[n] X.dx[n] += tmp * d Y.dx[n] -= tmp * d end ``` I did some benchmarking locally, and I think this might be the best way to implement the reverse-pass. It...
Mooncake.jl
github_2023
others
514
compintell
willtebbutt
@@ -0,0 +1,31 @@ +module MooncakeFluxExt + +using Mooncake, Flux +using Base: IEEEFloat +import Mooncake: DefaultCtx, rrule!!, @is_primitive, CoDual, zero_fcodual, NoRData + +@is_primitive DefaultCtx Tuple{ + typeof(Flux.Losses.mse),Array{P},Array{P} +} where {P<:IEEEFloat} + +function rrule!!( + ::CoDual{typeof(...
```suggestion ```
Mooncake.jl
github_2023
others
514
compintell
yebai
@@ -0,0 +1,29 @@ +module MooncakeFluxExt + +using Mooncake, Flux +using Base: IEEEFloat +import Mooncake: DefaultCtx, rrule!!, @is_primitive, CoDual, zero_fcodual, NoRData + +@is_primitive DefaultCtx Tuple{ + typeof(Flux.Losses.mse),Array{P},Array{P} +} where {P<:IEEEFloat} + +function rrule!!(
```suggestion # This is a performance-specific rule motivated by https://github.com/compintell/Mooncake.jl/issues/466 function rrule!!( ```
Mooncake.jl
github_2023
others
512
compintell
yebai
@@ -1,7 +1,47 @@ -# See the docstring for `BBCode` for some context on this file. +""" + module BBCode
Gemini [suggests](https://g.co/gemini/share/261dc2571263) ```suggestion module BasicBlockIR ```
Mooncake.jl
github_2023
others
512
compintell
yebai
@@ -1,7 +1,72 @@ -# See the docstring for `BBCode` for some context on this file. +""" + module BBCode
```suggestion module BasicBlockCode ```
Mooncake.jl
github_2023
others
477
compintell
willtebbutt
@@ -141,33 +141,27 @@ include("interface.jl") include("config.jl") include("developer_tools.jl") -export primal, - tangent, - randn_tangent, - increment!!, - NoTangent, - Tangent, - MutableTangent, - PossiblyUninitTangent, - set_to_zero!!, - tangent_type, - zero_tangent, - _scale, -...
```suggestion ```
Mooncake.jl
github_2023
others
477
compintell
willtebbutt
@@ -141,33 +141,27 @@ include("interface.jl") include("config.jl") include("developer_tools.jl") -export primal, - tangent, - randn_tangent, - increment!!, - NoTangent, - Tangent, - MutableTangent, - PossiblyUninitTangent, - set_to_zero!!, - tangent_type, - zero_tangent, - _scale, -...
```suggestion export tangent_type ```
Mooncake.jl
github_2023
others
477
compintell
willtebbutt
@@ -141,33 +141,27 @@ include("interface.jl") include("config.jl") include("developer_tools.jl") -export primal, - tangent, - randn_tangent, - increment!!, - NoTangent, - Tangent, - MutableTangent, - PossiblyUninitTangent, - set_to_zero!!, - tangent_type, - zero_tangent, - _scale, -...
```suggestion @public set_to_zero!!, increment!! ```
Mooncake.jl
github_2023
others
477
compintell
willtebbutt
@@ -141,33 +141,27 @@ include("interface.jl") include("config.jl") include("developer_tools.jl") -export primal, - tangent, - randn_tangent, - increment!!, - NoTangent, - Tangent, - MutableTangent, - PossiblyUninitTangent, - set_to_zero!!, - tangent_type, - zero_tangent, - _scale, -...
```suggestion export rrule!! ```
Mooncake.jl
github_2023
others
477
compintell
willtebbutt
@@ -142,33 +142,11 @@ include("interface.jl") include("config.jl") include("developer_tools.jl") -export primal, - tangent, - randn_tangent, - increment!!, - NoTangent, - Tangent, - MutableTangent, - PossiblyUninitTangent, - set_to_zero!!, - tangent_type, - zero_tangent, - _scale, -...
Very good point. No, we almost certainly do not.
Mooncake.jl
github_2023
others
477
compintell
willtebbutt
@@ -142,33 +142,11 @@ include("interface.jl") include("config.jl") include("developer_tools.jl") -export primal, - tangent, - randn_tangent, - increment!!, - NoTangent, - Tangent, - MutableTangent, - PossiblyUninitTangent, - set_to_zero!!, - tangent_type, - zero_tangent, - _scale, -...
Why is it that DI needs `zero_tangent`? I know that it need `tangent_type` to check whether the user has provided the correct thing before calling `Mooncake.value_and_pullback!!`, but I'm not sure why we need `zero_tangent`.
Mooncake.jl
github_2023
others
477
compintell
willtebbutt
@@ -0,0 +1,14 @@ +# Interface + +This is the public interface that day-to-day users of AD are expected to interact with if +for some reason DifferentiationInterface.jl does not suffice. +If you have not tried using Mooncake.jl via DifferentiationInterface.jl, please do so. +See [Tutorial](@ref) for more info. + +```@do...
Right you are. I had to apply a filter to handle the unexported bits of the public interface, but I think I've now got it working.
Mooncake.jl
github_2023
others
496
compintell
willtebbutt
@@ -24,6 +24,8 @@ end const MatrixOrView{T} = Union{Matrix{T},SubArray{T,2,<:Array{T}}} const VecOrView{T} = Union{Vector{T},SubArray{T,1,<:Array{T}}} const BlasRealFloat = Union{Float32,Float64} +const BlasComplexFloat = Union{ComplexF32,ComplexF64} +const BlasRealOrComplexFloat = Union{BlasRealFloat,BlasComplexFlo...
I think this is equal to `BLAS.BlasFloat`, so maybe just import that [here](https://github.com/compintell/Mooncake.jl/blob/6c99c65600345800ba07829683cd469494eeed6c/src/Mooncake.jl#L46)?
Mooncake.jl
github_2023
others
496
compintell
willtebbutt
@@ -36,6 +38,11 @@ function arrayify(x::CoDual{A}) where {A<:AbstractArray{<:BlasRealFloat}} return arrayify(primal(x), tangent(x))::Tuple{A,A} end arrayify(x::Array{P}, dx::Array{P}) where {P<:BlasRealFloat} = (x, dx) + +function arrayify(x::CoDual{A}) where {A<:AbstractArray{<:Union{ComplexF32, ComplexF64}}} +...
```suggestion ``` I think it should be fine to just change the element types permitted in the existing method of `arrayify` which accepts `CoDual`s from `BlasRealFloat` to `BlasFloat`.
Mooncake.jl
github_2023
others
496
compintell
willtebbutt
@@ -36,6 +38,11 @@ function arrayify(x::CoDual{A}) where {A<:AbstractArray{<:BlasRealFloat}} return arrayify(primal(x), tangent(x))::Tuple{A,A} end arrayify(x::Array{P}, dx::Array{P}) where {P<:BlasRealFloat} = (x, dx) + +function arrayify(x::CoDual{A}) where {A<:AbstractArray{<:Union{ComplexF32, ComplexF64}}} +...
```suggestion arrayify(x::Array{P}, dx::Array{<:Tangent}) where P<:BlasComplexFloat = x, reinterpret(P, dx) ```
Mooncake.jl
github_2023
others
496
compintell
willtebbutt
@@ -299,6 +306,51 @@ function rrule!!( return y_dy, symv!_adjoint end +@is_primitive( + MinimalCtx, + Tuple{ + typeof(BLAS.nrm2), + Int, + X, + Int, + } where {T<:Union{Float64, Float32, ComplexF64, ComplexF32}, X<:Union{Ptr{T},AbstractArray{T}}},
```suggestion } where {T<:BlasFloat, X<:Union{Ptr{T},AbstractArray{T}}}, ```
Mooncake.jl
github_2023
others
496
compintell
willtebbutt
@@ -299,6 +306,51 @@ function rrule!!( return y_dy, symv!_adjoint end +@is_primitive( + MinimalCtx, + Tuple{ + typeof(BLAS.nrm2), + Int, + X, + Int, + } where {T<:Union{Float64, Float32, ComplexF64, ComplexF32}, X<:Union{Ptr{T},AbstractArray{T}}}, +) +function rrule!!( + ...
```suggestion X_dX::CoDual{<:Union{Ptr{T},AbstractArray{T}} where {T<:BlasFloat}}, ```
Mooncake.jl
github_2023
others
496
compintell
willtebbutt
@@ -299,6 +306,51 @@ function rrule!!( return y_dy, symv!_adjoint end +@is_primitive( + MinimalCtx, + Tuple{ + typeof(BLAS.nrm2), + Int, + X, + Int, + } where {T<:Union{Float64, Float32, ComplexF64, ComplexF32}, X<:Union{Ptr{T},AbstractArray{T}}}, +) +function rrule!!( + ...
```suggestion } where {T<:BlasFloat, X<:Union{Ptr{T},AbstractArray{T}}}, ```
Mooncake.jl
github_2023
others
496
compintell
willtebbutt
@@ -299,6 +306,51 @@ function rrule!!( return y_dy, symv!_adjoint end +@is_primitive( + MinimalCtx, + Tuple{ + typeof(BLAS.nrm2), + Int, + X, + Int, + } where {T<:Union{Float64, Float32, ComplexF64, ComplexF32}, X<:Union{Ptr{T},AbstractArray{T}}}, +) +function rrule!!( + ...
```suggestion X_dX::CoDual{<:Union{Ptr{T},AbstractArray{T}} where {T<:BlasFloat}}, ```
Mooncake.jl
github_2023
others
496
compintell
willtebbutt
@@ -755,7 +807,7 @@ for (trsm, elty) in ((:dtrsm_, :Float64), (:strsm_, :Float32)) end end -function blas_matrices(rng::AbstractRNG, P::Type{<:BlasRealFloat}, p::Int, q::Int) +function blas_matrices(rng::AbstractRNG, P::Type{<:BlasRealOrComplexFloat}, p::Int, q::Int)
```suggestion function blas_matrices(rng::AbstractRNG, P::Type{<:BlasFloat}, p::Int, q::Int) ```
Mooncake.jl
github_2023
others
496
compintell
willtebbutt
@@ -767,7 +819,7 @@ function blas_matrices(rng::AbstractRNG, P::Type{<:BlasRealFloat}, p::Int, q::In return Xs end -function blas_vectors(rng::AbstractRNG, P::Type{<:BlasRealFloat}, p::Int) +function blas_vectors(rng::AbstractRNG, P::Type{<:BlasRealOrComplexFloat}, p::Int)
```suggestion function blas_vectors(rng::AbstractRNG, P::Type{<:BlasFloat}, p::Int) ```
Mooncake.jl
github_2023
others
496
compintell
willtebbutt
@@ -24,6 +24,8 @@ end const MatrixOrView{T} = Union{Matrix{T},SubArray{T,2,<:Array{T}}} const VecOrView{T} = Union{Vector{T},SubArray{T,1,<:Array{T}}} const BlasRealFloat = Union{Float32,Float64} +const BlasComplexFloat = Union{ComplexF32,ComplexF64} +const BlasRealOrComplexFloat = Union{BlasRealFloat,BlasComplexFlo...
```suggestion arrayify(x::CoDual{<:AbstractArray{<:BlasFloat}}) ```
Mooncake.jl
github_2023
others
476
compintell
willtebbutt
@@ -2,15 +2,15 @@ [![Build Status](https://github.com/compintell/Mooncake.jl/actions/workflows/CI.yml/badge.svg?branch=main)](https://github.com/compintell/Mooncake.jl/actions/workflows/CI.yml?query=branch%3Amain) [![codecov](https://codecov.io/github/compintell/Mooncake.jl/graph/badge.svg?token=NUPWTB4IAP)](https:...
```suggestion [![Docs](https://img.shields.io/badge/docs-dev-blue.svg)](https://compintell.github.io/Mooncake.jl/dev) ``` I'd rather not use stable vs dev docs in Mooncake's case, because we should never have anything on main that hasn't been released. Do you think we need to call them dev docs anyway?
Mooncake.jl
github_2023
others
476
compintell
willtebbutt
@@ -0,0 +1,125 @@ +# Tutorial + +There are two ways to compute gradients with Mooncake.jl: + +- through the standardized [DifferentiationInterface.jl](https://github.com/JuliaDiff/DifferentiationInterface.jl) API +- through the native Mooncake.jl API + +We recommend the former to start with, especially if you want to e...
```suggestion You can also use the native API of Mooncake.jl, discussed below. ``` I don't think that it's really more natural to use the Mooncake.jl API -- I'm perfectly happy to really suggest to users that they use DI.jl.
Mooncake.jl
github_2023
others
457
compintell
willtebbutt
@@ -409,36 +410,110 @@ This "vector-Jacobian product" expression is commonly used to explain AD, and is # Directional Derivatives and Gradients -Now we turn to using reverse-mode AD to compute the gradient of a function. -In short, given a function ``g : \mathcal{X} \to \RR`` with derivative ``D g [x]`` at ``x``, ...
I think I'm not quite clear what is meant by `with magnitude equal to the directional derivative in that steepest direction.` -- is there a precise mathematical statement by which you can explain what this means?
Mooncake.jl
github_2023
others
457
compintell
willtebbutt
@@ -409,36 +410,110 @@ This "vector-Jacobian product" expression is commonly used to explain AD, and is # Directional Derivatives and Gradients -Now we turn to using reverse-mode AD to compute the gradient of a function. -In short, given a function ``g : \mathcal{X} \to \RR`` with derivative ``D g [x]`` at ``x``, ...
Is this correct, technically? We make use of the norms for both X and Y in the definition of the Frechet derivative, which I've been assuming we take to be the norms induced by whichever inner products we pick on X and Y. Would it be more accurate to point out that the definition is invariant because all norms in are e...
Mooncake.jl
github_2023
others
457
compintell
willtebbutt
@@ -409,36 +410,110 @@ This "vector-Jacobian product" expression is commonly used to explain AD, and is # Directional Derivatives and Gradients -Now we turn to using reverse-mode AD to compute the gradient of a function. -In short, given a function ``g : \mathcal{X} \to \RR`` with derivative ``D g [x]`` at ``x``, ...
```suggestion In practice, Mooncake uses the Euclidean inner product, extended in the "obvious way" to other composite data types (that is, as if everything is flattened and embedded in ``\mathbb{R}^N``), but we endeavour to keep the discussion general in order to make the role of the inner product explicit. ``` gra...
Mooncake.jl
github_2023
others
457
compintell
willtebbutt
@@ -409,36 +410,110 @@ This "vector-Jacobian product" expression is commonly used to explain AD, and is # Directional Derivatives and Gradients -Now we turn to using reverse-mode AD to compute the gradient of a function. -In short, given a function ``g : \mathcal{X} \to \RR`` with derivative ``D g [x]`` at ``x``, ...
```suggestion where the second equality follows from the gradient's definition. ``` Reading this, I briefly thought that we had multiple definitions of the gradient lying around, and the one you are using here is the "implicit" one, before realising you're just trying to point out that our definition of the gradient...
Mooncake.jl
github_2023
others
457
compintell
willtebbutt
@@ -409,36 +410,110 @@ This "vector-Jacobian product" expression is commonly used to explain AD, and is # Directional Derivatives and Gradients -Now we turn to using reverse-mode AD to compute the gradient of a function. -In short, given a function ``g : \mathcal{X} \to \RR`` with derivative ``D g [x]`` at ``x``, ...
```suggestion The adjoint of the derivative of ``f(x, y) = x + y_1 y_2`` (see [above](#AD-of-a-Julia-function:-a-slightly-less-trivial-example)) immediately gives ``` nit-pick: I don't _believe_ we refer to the "adjoint derivative" anywhere, but we do refer to the "adjoint" and the "adjoint of the derivative" interc...
Mooncake.jl
github_2023
others
457
compintell
willtebbutt
@@ -409,36 +410,110 @@ This "vector-Jacobian product" expression is commonly used to explain AD, and is # Directional Derivatives and Gradients -Now we turn to using reverse-mode AD to compute the gradient of a function. -In short, given a function ``g : \mathcal{X} \to \RR`` with derivative ``D g [x]`` at ``x``, ...
Same here -- "Adjoint Derivatives" vs "Adjoint" or "Adjoint of the Derivative" etc
Mooncake.jl
github_2023
others
457
compintell
willtebbutt
@@ -409,36 +410,110 @@ This "vector-Jacobian product" expression is commonly used to explain AD, and is # Directional Derivatives and Gradients -Now we turn to using reverse-mode AD to compute the gradient of a function. -In short, given a function ``g : \mathcal{X} \to \RR`` with derivative ``D g [x]`` at ``x``, ...
```suggestion For any basis there exists such a reciprocal basis, and they are the same for orthonormal bases such as the standard basis. As a result, you can replace any occurrences of ``\{\mathbf{e}^i\}`` with ``\{\mathbf{e}_i\}`` in what follows and still have a correct understanding of the mathematics underpinning...
Mooncake.jl
github_2023
others
458
compintell
yebai
@@ -0,0 +1,93 @@ +# All of the code here purely exists to work around current performance limitations of +# Mooncake.jl. In order to prevent this from getting out of hand, there are several +# conventions to which we adhere when writing these rules: +# +# 1. for each rule, a comment is added containing a link to the is...
Perhaps ```suggestion dx .+= 2 .* dz .* x.x ```
Mooncake.jl
github_2023
others
469
compintell
gdalle
@@ -202,16 +202,16 @@ end WARNING: experimental functionality. Interface subject to change without warning! -Like other methods of `value_and_pullback!!`, but makes use of the `cache` object in order -to avoid having to re-allocate various tangent objects repeatedly. +Like other methods of `value_and_pullback!!`, ...
Maybe specify "or `deepcopy`"? I think this may come back to bite us more than once. Otherwise good to go, thanks!
Mooncake.jl
github_2023
others
382
compintell
willtebbutt
@@ -0,0 +1,80 @@ +# Compilation process + +The whole rule building is done statically based on types. The first method of `build_rrule` turns argument values into a signature: + +```julia +build_rrule(args...; debug_mode=false) +``` + +The actual action happens in [`s2s_reverse_mode_ad.jl`](https://github.com/compintel...
```suggestion - `ir.argtypes` is the signature. Some are annotated with `Core.Const` to facilitate constant propagation for instance. Other annotations are `PartialStruct`, `Conditional`, `PartialTypeVar`. `Core.Compiler.widenconst` is used to extract types from these. ```
Mooncake.jl
github_2023
others
382
compintell
willtebbutt
@@ -0,0 +1,80 @@ +# Compilation process + +The whole rule building is done statically based on types. The first method of `build_rrule` turns argument values into a signature: + +```julia +build_rrule(args...; debug_mode=false) +``` + +The actual action happens in [`s2s_reverse_mode_ad.jl`](https://github.com/compintel...
```suggestion If we define a rule we should set `is_primitive` to `true` for the corresponding function. ```
Mooncake.jl
github_2023
others
408
compintell
willtebbutt
@@ -168,27 +168,32 @@ Following either resource will yield the derivative: ```math D f [X] (\dot{X}) = \dot{X}^\top X + X^\top \dot{X} ``` -Observe that this is indeed a linear operator (i.e. it is linear in its argument, ``\dot{X}``). -(You can always plug it in to the definition of the Frechet derivative to confir...
Does the matrix `A` exist? Unless I'm mistaken, it's not the case that you can necessarily represent any linear operator on a matrix via multiplication with another matrix. If you treat a matrix `X` of size `P x Q` as a vector of length `PQ`, you can certainly represent any linear transform via a matrix of size `PQ ...
Mooncake.jl
github_2023
others
408
compintell
willtebbutt
@@ -168,27 +168,32 @@ Following either resource will yield the derivative: ```math D f [X] (\dot{X}) = \dot{X}^\top X + X^\top \dot{X} ``` -Observe that this is indeed a linear operator (i.e. it is linear in its argument, ``\dot{X}``). -(You can always plug it in to the definition of the Frechet derivative to confir...
To my previous point regarding the existence of `A`, it's not obvious to me that the jump between these two lines -- substituting `V^T X + X^T V` for `AV` -- can be made sense of. Concretely: does there exist a matrix `A` such that `AV = V^T X + X^T V` for any `X` and `V`?
Mooncake.jl
github_2023
others
408
compintell
willtebbutt
@@ -168,27 +168,32 @@ Following either resource will yield the derivative: ```math D f [X] (\dot{X}) = \dot{X}^\top X + X^\top \dot{X} ``` -Observe that this is indeed a linear operator (i.e. it is linear in its argument, ``\dot{X}``). -(You can always plug it in to the definition of the Frechet derivative to confir...
Unless I'm mistaken, `C` is not used anywhere?
Mooncake.jl
github_2023
others
408
compintell
willtebbutt
@@ -168,27 +168,32 @@ Following either resource will yield the derivative: ```math D f [X] (\dot{X}) = \dot{X}^\top X + X^\top \dot{X} ``` -Observe that this is indeed a linear operator (i.e. it is linear in its argument, ``\dot{X}``). -(You can always plug it in to the definition of the Frechet derivative to confir...
This is defined further up. Perhaps we can remove it?
Mooncake.jl
github_2023
others
408
compintell
willtebbutt
@@ -168,27 +168,32 @@ Following either resource will yield the derivative: ```math D f [X] (\dot{X}) = \dot{X}^\top X + X^\top \dot{X} ``` -Observe that this is indeed a linear operator (i.e. it is linear in its argument, ``\dot{X}``). -(You can always plug it in to the definition of the Frechet derivative to confir...
I'm not sure how to make sense of this line. My understanding is that you're saying we should view `B` as the coordinate form representation of the adjoint of the derivative. So it seems like we have a "type" error -- `D f [X]^\ast (U)` is the matrix resulting from applying the adjoint operator to `U`, but `B` is th...
Mooncake.jl
github_2023
others
408
compintell
willtebbutt
@@ -178,16 +178,17 @@ Using the usual definition of the inner product between matrices, ``` we can rearrange the inner product as follows: ```math -\begin{align} - \langle \bar{Y}, D f [X] (\dot{X}) \rangle &= \langle \bar{Y}, \dot{X}^\top X + X^\top \dot{X} \rangle \nonumber \\ - &= \textrm{tr} (\bar{Y}^\...
```suggestion & =\textrm{tr}(\bar{Y}^{\top}\left(\dot{X}^{\top}X+X^{\top}\dot{X}\right))\\ ```
Mooncake.jl
github_2023
others
408
compintell
willtebbutt
@@ -178,16 +178,17 @@ Using the usual definition of the inner product between matrices, ``` we can rearrange the inner product as follows: ```math -\begin{align} - \langle \bar{Y}, D f [X] (\dot{X}) \rangle &= \langle \bar{Y}, \dot{X}^\top X + X^\top \dot{X} \rangle \nonumber \\ - &= \textrm{tr} (\bar{Y}^\...
```suggestion & =\langle\dot{X},X\bar{Y}^{\top}\rangle+\langle X\bar{Y},\dot{X}\rangle\\ ``` I think the first arg to the second inner product should be transposed, since `<A, B> = tr(A^T B)`
Mooncake.jl
github_2023
others
408
compintell
willtebbutt
@@ -178,16 +178,17 @@ Using the usual definition of the inner product between matrices, ``` we can rearrange the inner product as follows: ```math -\begin{align} - \langle \bar{Y}, D f [X] (\dot{X}) \rangle &= \langle \bar{Y}, \dot{X}^\top X + X^\top \dot{X} \rangle \nonumber \\ - &= \textrm{tr} (\bar{Y}^\...
```suggestion & =\langle X(\bar{Y}^{\top} + \bar{Y}),\dot{X}\rangle. ``` should we simplify to this?
Mooncake.jl
github_2023
others
408
compintell
willtebbutt
@@ -178,16 +178,17 @@ Using the usual definition of the inner product between matrices, ``` we can rearrange the inner product as follows: ```math -\begin{align} - \langle \bar{Y}, D f [X] (\dot{X}) \rangle &= \langle \bar{Y}, \dot{X}^\top X + X^\top \dot{X} \rangle \nonumber \\ - &= \textrm{tr} (\bar{Y}^\...
I think this now has a different typo haha -- the multiplication order of `X` and `\bar{Y}` has been swapped, and I don't believe it should have been. ```suggestion D f [X]^\ast (\bar{Y}) = X (\bar{Y}^{\top} + \bar{Y}). ```
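The corrected adjoint in the suggestion above can be sanity-checked numerically via the defining identity ``\langle \bar{Y}, D f [X] (\dot{X}) \rangle = \langle D f [X]^\ast (\bar{Y}), \dot{X} \rangle``. A standalone NumPy sketch (not code from the PR; variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
P, Q = 4, 3
X = rng.standard_normal((P, Q))     # primal point of f(X) = X^T X
V = rng.standard_normal((P, Q))     # tangent, playing the role of X-dot
Ybar = rng.standard_normal((Q, Q))  # cotangent, playing the role of Y-bar

def inner(A, B):
    # Frobenius inner product <A, B> = tr(A^T B).
    return np.trace(A.T @ B)

DfV = V.T @ X + X.T @ V        # D f [X] (V)
adj = X @ (Ybar.T + Ybar)      # candidate D f [X]^* (Ybar) from the suggestion

# The adjoint identity should hold for arbitrary V and Ybar.
print(np.isclose(inner(Ybar, DfV), inner(adj, V)))  # True
```

The identity holds for arbitrary random `V` and `Ybar`, which is consistent with `X (Ybar^T + Ybar)` (rather than the swapped ordering) being the correct adjoint.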
Mooncake.jl
github_2023
others
422
compintell
yebai
@@ -1044,6 +1044,8 @@ function tangent_test_cases() (a=3, b=randn(10)), (a=randn(10), b=randn(10)), (Base.TOML.ErrorType(1), NoTangent()), # Enum + (((((randn(33)...,),),),),),
```suggestion # Regression tests to catch type inference failures, see https://github.com/compintell/Mooncake.jl/pull/422 (((((randn(33)...,),),),),), ```
Mooncake.jl
github_2023
others
421
compintell
willtebbutt
@@ -94,7 +94,7 @@ _simple_mlp(W2, W1, Y, X) = sum(abs2, Y - W2 * map(x -> x * (0 <= x), W1 * X)) _gp_lml(x, y, s) = logpdf(GP(SEKernel())(x, s), y) should_run_benchmark(::Val{:reverse_diff}, ::typeof(_gp_lml), x...) = false -should_run_benchmark(::Val{:enzyme}, ::typeof(_gp_lml), x...) = false +should_run_benchmark...
```suggestion ``` This can just be removed
Mooncake.jl
github_2023
others
390
compintell
willtebbutt
@@ -256,31 +254,31 @@ function increment_and_get_rdata!(f, r, t::CRC.Thunk) return increment_and_get_rdata!(f, r, CRC.unthunk(t)) end -@doc""" - rrule_wrapper(f::CoDual, args::CoDual...) +@doc """ + rrule_wrapper(f::CoDual, args::CoDual...) -Used to implement `rrule!!`s via `ChainRulesCore.rrule`. + Us...
This looks like a bug to me. Does it also look like a bug to you @gdalle ?
Mooncake.jl
github_2023
others
390
compintell
willtebbutt
@@ -3,6 +3,7 @@ include("front_matter.jl") @testset "Mooncake.jl" begin if test_group == "aqua"
We should probably rename this test group to "automated quality assurance", or really anything which highlights that it's not just Aqua.jl anymore.
Mooncake.jl
github_2023
others
320
compintell
ChrisRackauckas
@@ -0,0 +1,20 @@ +module MooncakeDiffEqBaseExt + +using DiffEqBase, Mooncake + +Mooncake.@from_rrule(
I think you'll need the solution adjoints as well. Should this all live with DiffEqBase?
Mooncake.jl
github_2023
others
354
compintell
mhauru
@@ -494,9 +494,42 @@ tangent field. This function uses [`Mooncake.build_rrule`](@ref) to construct a rule. This will use an `rrule!!` if one exists, and derive a rule otherwise. + +# Arguments +- `rng::AbstractRNG`: a random number generator +- `x...`: the function (first element) and its arguments (the remainder) ...
Is the default value the wrong way around here?
Mooncake.jl
github_2023
others
353
compintell
willtebbutt
@@ -0,0 +1,123 @@ +using Pkg +Pkg.activate(@__DIR__) +Pkg.develop(; path = joinpath(@__DIR__, "..", "..", "..")) + +using Bijectors: Bijectors +using LinearAlgebra: LinearAlgebra +using Random: randn + +""" +Type for specifying a test case for `test_rule`. +""" +struct TestCase + func::Function + arg::Any + na...
Why is it that you're summing over the output in all cases?
Mooncake.jl
github_2023
others
301
compintell
sunxd3
@@ -0,0 +1,23 @@ +# Running Tests Locally + +Mooncake.jl's test suite is fairly extensive. While you can use `Pkg.test` to run the test suite in the standard manner, this is not usually optimal in Mooncake.jl. When editing some code, you typically only want to run the tests associated with it, not the entire test suite...
not a suggestion, just an anecdotal point from me: I configure VSCode to automatically do the `Pkg.activate()` for the package folder I open. But what you wrote is better as it assumes less.
Mooncake.jl
github_2023
others
254
compintell
yebai
@@ -1,17 +1,9 @@ module MooncakeDynamicPPLExt -if isdefined(Base, :get_extension) - using DynamicPPL: DynamicPPL, istrans - using Mooncake: Mooncake -else - using ..DynamicPPL: DynamicPPL, istrans - using ..Mooncake: Mooncake -end - -using Mooncake: DefaultCtx, CoDual, simple_zero_adjoint +using DynamicP...
It looks good; can we remove the need for users to pass `Mooncake.DefaultCtx` explicitly, especially if `DefaultCtx` covers most cases? We can still support users in specifying customised contexts if needed.
Mooncake.jl
github_2023
others
254
compintell
yebai
@@ -0,0 +1,66 @@ +module MooncakeNNlibExt + +using NNlib, Random, Mooncake +using Base: IEEEFloat +using NNlib: dropout + +using NNlib: conv, depthwiseconv +import Mooncake: @from_rrule, DefaultCtx, MinimalCtx + +@from_rrule( + MinimalCtx, + Tuple{typeof(batched_mul), Array{P, 3}, Array{P, 3}} where {P<:IEEEFloat...
I am slightly confused about the use of `@eval` in some places but not others. Can you clarify the differences? ```suggestion @from_rrule( ```
Mooncake.jl
github_2023
others
254
compintell
yebai
@@ -0,0 +1,477 @@ +# +# General utilities +# + +function parse_signature_expr(sig::Expr) + # Different parsing is required for `Tuple{...}` vs `Tuple{...} where ...`. + if sig.head == :curly + @assert sig.args[1] == :Tuple + arg_type_symbols = sig.args[2:end] + where_params = nothing + els...
Consider adding a reference to Julia's overlay docs for people who don't know what method overlays are yet (I didn't know about them until recently, when I saw them in Reactant).
Mooncake.jl
github_2023
others
254
compintell
yebai
@@ -0,0 +1,477 @@ +# +# General utilities +# + +function parse_signature_expr(sig::Expr) + # Different parsing is required for `Tuple{...}` vs `Tuple{...} where ...`. + if sig.head == :curly + @assert sig.args[1] == :Tuple + arg_type_symbols = sig.args[2:end] + where_params = nothing + els...
As commented above, it is helpful to have a variant of `@zero_adjoint` with a default `ctx`, so users don't need to pass it explicitly unless in exceptional circumstances.
Mooncake.jl
github_2023
others
254
compintell
yebai
@@ -0,0 +1,477 @@ +# +# General utilities +# + +function parse_signature_expr(sig::Expr) + # Different parsing is required for `Tuple{...}` vs `Tuple{...} where ...`. + if sig.head == :curly + @assert sig.args[1] == :Tuple + arg_type_symbols = sig.args[2:end] + where_params = nothing + els...
I find the name `@non_differentiable` more intuitive. Is there any good reason for preferring the name `zero_adjoint` over `non_differentiable`?
Mooncake.jl
github_2023
others
254
compintell
mhauru
@@ -0,0 +1,33 @@ +# Tools for Rules + +Most of the time, Mooncake.jl can just differentiate your code, but you will need to intervene if you make use of a language feature which is unsupported. +However, this does not always necessitate writing your own `rrule!!` from scratch. +In this section, we detail some useful st...
```suggestion There are some instances where it is most convenient to implement a `Mooncake.rrule!!` by wrapping an existing `ChainRulesCore.rrule`. ```
Mooncake.jl
github_2023
others
254
compintell
mhauru
@@ -181,18 +184,31 @@ there is no code found, or if more than one `IRCode` instance returned. Returns a tuple containing the `IRCode` and its return type. """ -function lookup_ir(interp::CC.AbstractInterpreter, sig::Type{<:Tuple}) - output = Base.code_ircode_by_type(sig; interp) - if isempty(output) - ...
The docstring says "Get the IR unique IR associated", probably a typo.
Mooncake.jl
github_2023
others
254
compintell
mhauru
@@ -0,0 +1,477 @@ +# +# General utilities +# + +function parse_signature_expr(sig::Expr) + # Different parsing is required for `Tuple{...}` vs `Tuple{...} where ...`. + if sig.head == :curly + @assert sig.args[1] == :Tuple + arg_type_symbols = sig.args[2:end] + where_params = nothing + els...
Will it error in some understandable way?
Mooncake.jl
github_2023
others
254
compintell
mhauru
@@ -0,0 +1,477 @@ +# +# General utilities +# + +function parse_signature_expr(sig::Expr) + # Different parsing is required for `Tuple{...}` vs `Tuple{...} where ...`. + if sig.head == :curly + @assert sig.args[1] == :Tuple + arg_type_symbols = sig.args[2:end] + where_params = nothing + els...
I don't really know what this does, but it jumps at me as a call to `!!` that ignores the return value. If `increment!!` is guaranteed to mutate rather than create a new instance, could it be called `increment!`? If not, might this fail to do what is expected of it?
Mooncake.jl
github_2023
others
254
compintell
mhauru
@@ -0,0 +1,477 @@ +# +# General utilities +# + +function parse_signature_expr(sig::Expr) + # Different parsing is required for `Tuple{...}` vs `Tuple{...} where ...`. + if sig.head == :curly + @assert sig.args[1] == :Tuple + arg_type_symbols = sig.args[2:end] + where_params = nothing + els...
```suggestion Given a function `foo`, argument types `arg_types`, and a method of `ChainRulesCore.rrule` ```
Mooncake.jl
github_2023
others
254
compintell
mhauru
@@ -0,0 +1,136 @@ +overlay_tester(x) = 2x +Mooncake.@mooncake_overlay overlay_tester(x) = 3x + +zero_tester(x) = 0 +Mooncake.@zero_adjoint MinimalCtx Tuple{typeof(zero_tester), Float64} + +vararg_zero_tester(x...) = 0 +Mooncake.@zero_adjoint MinimalCtx Tuple{typeof(vararg_zero_tester), Vararg} +
Is there a reason to have these outside the module?
Mooncake.jl
github_2023
others
281
compintell
willtebbutt
@@ -59,8 +59,12 @@ function logdensity_and_gradient(∇l::MooncakeGradientLogDensity, x::Vector{Flo end # Interop with ADTypes. +function getconfig(x::ADTypes.AutoMooncake) + c = x.config + isnothing(c) ? Mooncake.DEFAULT_CONFIG : c
```suggestion return isnothing(c) ? Mooncake.DEFAULT_CONFIG : c ``` formatting
Mooncake.jl
github_2023
others
228
compintell
willtebbutt
@@ -474,45 +448,36 @@ end stackdict[x] = zt return _map_if_assigned!(Base.Fix2(zero_tangent_internal, stackdict), zt, x)::Array{tangent_type(P), N} end -@inline function zero_tangent_internal(x::P, stackdict::IdDict) where {P<:Union{Tuple, NamedTuple}} - return tangent_type(P) == NoTangent ? NoTangent() :...
Would you mind fleshing out the error message here? This should probably be `throw(ArgumentError("informative message"))`, rather than just a plain error.
Mooncake.jl
github_2023
others
228
compintell
willtebbutt
@@ -474,45 +448,36 @@ end stackdict[x] = zt return _map_if_assigned!(Base.Fix2(zero_tangent_internal, stackdict), zt, x)::Array{tangent_type(P), N} end -@inline function zero_tangent_internal(x::P, stackdict::IdDict) where {P<:Union{Tuple, NamedTuple}} - return tangent_type(P) == NoTangent ? NoTangent() :...
You've gone for `Union{IdDict, Nothing}` here as the restriction on the type of `stackdict`, whereas in other places `Any` has been used. Can we use `Any` here as well, or is there a good reason not to that I'm missing?
Mooncake.jl
github_2023
others
228
compintell
willtebbutt
@@ -529,47 +494,56 @@ details -- this docstring is intentionally non-specific in order to avoid becomi Required for testing. Generate a randomly-chosen tangent to `x`. +The design is closely modelled after `zero_tangent`. """ -randn_tangent(::AbstractRNG, ::NoTangent) = NoTangent() -randn_tangent(rng::AbstractRNG,...
Same thing here re type constraints.
Mooncake.jl
github_2023
others
228
compintell
willtebbutt
@@ -163,51 +163,46 @@ end end -# TODO: ideally we want to add the following test to the above testset (defined src/tangent.jl) -# but we have to delay this until `randn_tangent` is implemented and working. -@testset "zero_tangent" begin +# TODO: add the following test to `tangent_test_cases` +@testset "zero_tan...
Formatting, maybe consider something like ```suggestion @test ==( Tapir.zero_tangent(bar), Tangent{@NamedTuple{a::PossiblyUninitTangent{Float64}, b::PossiblyUninitTangent{Any}}}( (a=PossiblyUninitTangent{Float64}(0.0), b=PossiblyUninitTangent{Any}(0.0)) ) ) ``` I really need to set up forma...
Mooncake.jl
github_2023
others
242
compintell
yebai
@@ -0,0 +1,19 @@ +module TapirDynamicPPLExt + +if isdefined(Base, :get_extension) + using DynamicPPL: DynamicPPL + using Tapir: Tapir +else + using ..DynamicPPL: DynamicPPL + using ..Tapir: Tapir +end + +using Tapir: DefaultCtx, CoDual, NoPullback, primal, zero_fcodual + +# This is purely an optimisation. +...
Can we have a macro / function to automate the process of specifying these gradient-skipping rules? IIRC, ReverseDiff has SkipOptimise which does similar things.
Mooncake.jl
github_2023
others
220
compintell
willtebbutt
@@ -118,7 +115,7 @@ We then generalise. _**Reverse-Mode AD: what does it do in Euclidean space?**_ In this setting, the goal of reverse-mode AD is the following: given a function ``f : \RR^P \to \RR^Q`` which is differentiable at ``x \in \RR^P`` with Jacobian ``J[x]`` at ``x``, compute ``J[x]^\top \bar{y}`` for any...
Could we stick with the original wording here?
Mooncake.jl
github_2023
others
220
compintell
willtebbutt
@@ -259,11 +256,9 @@ g -> (g, (y[2] * g, y[1] * g)) As before, we work through in detail. - - _**Step 1: Differentiable Mathematical Model**_ -There are a couple of aspects of `f` which require thought: +There are a couple of aspects of `f` which require thoughts:
```suggestion There are a couple of aspects of `f` which require thought: ``` I'm pretty sure the original wording is correct (grammatically speaking)
Mooncake.jl
github_2023
others
220
compintell
willtebbutt
@@ -173,7 +173,8 @@ Consider the usual inner product to derive the adjoint: ```math \begin{align} \langle \bar{y}, D f [x] (\dot{x}) \rangle &= \langle (\bar{y}_1, \bar{y}_2), (\dot{x}, D \varphi [x](\dot{x})) \rangle \nonumber \\ - &= \langle \bar{y}_1, \dot{x} \rangle + \langle D \varphi [x]^\ast (\bar{...
This is great -- much clearer now
Mooncake.jl
github_2023
others
220
compintell
willtebbutt
@@ -340,6 +341,7 @@ where ``\mathbf{1}`` is the vector of length ``N`` in which each element is equa (Observe that this agrees with the result we derived earlier for functions which don't mutate their arguments). Now that we know what the adjoint is, we'll write down the `rrule!!`, and then explain what is going on...
I like this. I think I'd like to re-word very slightly to: ```suggestion This hand-written implementation is to aid your understanding -- Tapir.jl should be relied upon to generate this code automatically in practice. ```