
Comments (5)

dchang10 commented on September 24, 2024

I seem to have made some change and am now unable to reproduce the error. I guess I will be closing this issue since I can't figure out what caused the error in the first place.


maleadt commented on September 24, 2024

Can you post the actual error message? It's unclear what part of the toolchain is failing here.
Also, do you have an MWE?


dchang10 commented on September 24, 2024

I get this error message:

ERROR: Compilation to native code failed; see below for details.
If you think this is a bug, please file an issue and attach /var/folders/69/qsgfh7hj7csg_x4f7kcm2p9r0000gq/T/jl_JwhY1QBopO.metallib.
Stacktrace:
  [1] error(s::String)
    @ Base ./error.jl:35
  [2] link(job::GPUCompiler.CompilerJob, compiled::NamedTuple{(:image, :entry), Tuple{Vector{UInt8}, String}}; return_function::Bool)
    @ Metal ~/.julia/packages/Metal/OchAS/src/compiler/compilation.jl:78
  [3] link(job::GPUCompiler.CompilerJob, compiled::NamedTuple{(:image, :entry), Tuple{Vector{UInt8}, String}})
    @ Metal ~/.julia/packages/Metal/OchAS/src/compiler/compilation.jl:65
  [4] actual_compilation(cache::Dict{Any, Any}, src::Core.MethodInstance, world::UInt64, cfg::GPUCompiler.CompilerConfig{GPUCompiler.MetalCompilerTarget, Metal.MetalCompilerParams}, compiler::typeof(Metal.compile), linker::typeof(Metal.link))
    @ GPUCompiler ~/.julia/packages/GPUCompiler/U36Ed/src/execution.jl:132
  [5] cached_compilation(cache::Dict{Any, Any}, src::Core.MethodInstance, cfg::GPUCompiler.CompilerConfig{GPUCompiler.MetalCompilerTarget, Metal.MetalCompilerParams}, compiler::Function, linker::Function)
    @ GPUCompiler ~/.julia/packages/GPUCompiler/U36Ed/src/execution.jl:103
  [6] macro expansion
    @ ~/.julia/packages/Metal/OchAS/src/compiler/execution.jl:185 [inlined]
  [7] macro expansion
    @ ./lock.jl:267 [inlined]
  [8] mtlfunction(f::GPUArrays.var"#broadcast_kernel#38", tt::Type{Tuple{Metal.mtlKernelContext, MtlDeviceMatrix{Float32, 1}, Base.Broadcast.Broadcasted{Metal.MtlArrayStyle{2, Metal.MTL.MTLResourceStorageModePrivate}, Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}, Krang.ElectronSynchrotronPowerLawIntensity, Tuple{Base.Broadcast.Extruded{MtlDeviceMatrix{Krang.IntensityPixel{Float32}, 1}, Tuple{Bool, Bool}, Tuple{Int64, Int64}}, Metal.MtlRefValue{Krang.UnionGeometry{Krang.ConeGeometry{Float32, Tuple{StaticArraysCore.SVector{3, Float32}, StaticArraysCore.SVector{3, Float32}, Tuple{Int64, Int64, Int64}, typeof(profile), Float32, Float32}}, Krang.ConeGeometry{Float32, Tuple{StaticArraysCore.SVector{3, Float32}, StaticArraysCore.SVector{3, Float32}, Tuple{Int64, Int64, Int64}, typeof(profile), Float32, Float32}}}}}}, Int64}}; name::Nothing, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ Metal ~/.julia/packages/Metal/OchAS/src/compiler/execution.jl:180
  [9] mtlfunction
    @ ~/.julia/packages/Metal/OchAS/src/compiler/execution.jl:178 [inlined]
 [10] macro expansion
    @ ~/.julia/packages/Metal/OchAS/src/compiler/execution.jl:85 [inlined]
 [11] #launch_heuristic#96
    @ ~/.julia/packages/Metal/OchAS/src/gpuarrays.jl:14 [inlined]
 [12] launch_heuristic
    @ ~/.julia/packages/Metal/OchAS/src/gpuarrays.jl:12 [inlined]
 [13] _copyto!
    @ ~/.julia/packages/GPUArrays/Hd5Sk/src/host/broadcast.jl:56 [inlined]
 [14] copyto!
    @ ~/.julia/packages/GPUArrays/Hd5Sk/src/host/broadcast.jl:37 [inlined]
 [15] copy
    @ ~/.julia/packages/GPUArrays/Hd5Sk/src/host/broadcast.jl:28 [inlined]
 [16] materialize(bc::Base.Broadcast.Broadcasted{Metal.MtlArrayStyle{2, Metal.MTL.MTLResourceStorageModePrivate}, Nothing, Krang.ElectronSynchrotronPowerLawIntensity, Tuple{MtlMatrix{Krang.IntensityPixel{Float32}, Metal.MTL.MTLResourceStorageModePrivate}, Base.RefValue{Krang.UnionGeometry{Krang.ConeGeometry{Float32, Tuple{StaticArraysCore.SVector{3, Float32}, StaticArraysCore.SVector{3, Float32}, Tuple{Int64, Int64, Int64}, typeof(profile), Float32, Float32}}, Krang.ConeGeometry{Float32, Tuple{StaticArraysCore.SVector{3, Float32}, StaticArraysCore.SVector{3, Float32}, Tuple{Int64, Int64, Int64}, typeof(profile), Float32, Float32}}}}}})
    @ Base.Broadcast ./broadcast.jl:873
 [17] macro expansion
    @ ~/.julia/packages/Metal/OchAS/src/utilities.jl:10 [inlined]
 [18] top-level scope
    @ ~/.julia/packages/Metal/OchAS/src/pool.jl:175 [inlined]
 [19] top-level scope
    @ ~/Software/Krang.jl/examples/gpuexample.jl:0

caused by: NSError: Undefined symbols:
  _julia_asn_3237, referenced from: _Z16broadcast_kernel16mtlKernelContext14MtlDeviceArrayI7Float32Li2ELi1EE11BroadcastedI13MtlArrayStyleILi2E39Metal_MTL_MTLResourceStorageModePrivateE5TupleI5OneToI5Int64ES5_IS6_EE36ElectronSynchrotronPowerLawIntensityS4_I8ExtrudedIS0_I14IntensityPixelIS1_ELi2ELi1EES4_I4BoolS10_ES4_IS6_S6_EE11MtlRefValueI13UnionGeometryI12ConeGeometryIS1_S4_I6SArrayIS4_ILi3EES1_Li1ELi3EES14_IS4_ILi3EES1_Li1ELi3EES4_IS6_S6_S6_E7profileS1_S1_EES13_IS1_S4_IS14_IS4_ILi3EES1_Li1ELi3EES14_IS4_ILi3EES1_Li1ELi3EES4_IS6_S6_S6_ES15_S1_S1_EEEEEES6_
  _julia_asn_3237, referenced from: _Z16broadcast_kernel16mtlKernelContext14MtlDeviceArrayI7Float32Li2ELi1EE11BroadcastedI13MtlArrayStyleILi2E39Metal_MTL_MTLResourceStorageModePrivateE5TupleI5OneToI5Int64ES5_IS6_EE36ElectronSynchrotronPowerLawIntensityS4_I8ExtrudedIS0_I14IntensityPixelIS1_ELi2ELi1EES4_I4BoolS10_ES4_IS6_S6_EE11MtlRefValueI13UnionGeometryI12ConeGeometryIS1_S4_I6SArrayIS4_ILi3EES1_Li1ELi3EES14_IS4_ILi3EES1_Li1ELi3EES4_IS6_S6_S6_E7profileS1_S1_EES13_IS1_S4_IS14_IS4_ILi3EES1_Li1ELi3EES14_IS4_ILi3EES1_Li1ELi3EES4_IS6_S6_S6_ES15_S1_S1_EEEEEES6_
 (AGXMetalG13X, code 2)
Stacktrace:
  [1] MTLComputePipelineState(dev::Metal.MTL.MTLDeviceInstance, fun::Metal.MTL.MTLFunctionInstance)
    @ Metal.MTL ~/.julia/packages/Metal/OchAS/lib/mtl/compute_pipeline.jl:60
  [2] link(job::GPUCompiler.CompilerJob, compiled::NamedTuple{(:image, :entry), Tuple{Vector{UInt8}, String}}; return_function::Bool)
    @ Metal ~/.julia/packages/Metal/OchAS/src/compiler/compilation.jl:70
  [3] link(job::GPUCompiler.CompilerJob, compiled::NamedTuple{(:image, :entry), Tuple{Vector{UInt8}, String}})
    @ Metal ~/.julia/packages/Metal/OchAS/src/compiler/compilation.jl:65
  [4] actual_compilation(cache::Dict{Any, Any}, src::Core.MethodInstance, world::UInt64, cfg::GPUCompiler.CompilerConfig{GPUCompiler.MetalCompilerTarget, Metal.MetalCompilerParams}, compiler::typeof(Metal.compile), linker::typeof(Metal.link))
    @ GPUCompiler ~/.julia/packages/GPUCompiler/U36Ed/src/execution.jl:132
  [5] cached_compilation(cache::Dict{Any, Any}, src::Core.MethodInstance, cfg::GPUCompiler.CompilerConfig{GPUCompiler.MetalCompilerTarget, Metal.MetalCompilerParams}, compiler::Function, linker::Function)
    @ GPUCompiler ~/.julia/packages/GPUCompiler/U36Ed/src/execution.jl:103
  [6] macro expansion
    @ ~/.julia/packages/Metal/OchAS/src/compiler/execution.jl:185 [inlined]
  [7] macro expansion
    @ ./lock.jl:267 [inlined]
  [8] mtlfunction(f::GPUArrays.var"#broadcast_kernel#38", tt::Type{Tuple{Metal.mtlKernelContext, MtlDeviceMatrix{Float32, 1}, Base.Broadcast.Broadcasted{Metal.MtlArrayStyle{2, Metal.MTL.MTLResourceStorageModePrivate}, Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}, Krang.ElectronSynchrotronPowerLawIntensity, Tuple{Base.Broadcast.Extruded{MtlDeviceMatrix{Krang.IntensityPixel{Float32}, 1}, Tuple{Bool, Bool}, Tuple{Int64, Int64}}, Metal.MtlRefValue{Krang.UnionGeometry{Krang.ConeGeometry{Float32, Tuple{StaticArraysCore.SVector{3, Float32}, StaticArraysCore.SVector{3, Float32}, Tuple{Int64, Int64, Int64}, typeof(profile), Float32, Float32}}, Krang.ConeGeometry{Float32, Tuple{StaticArraysCore.SVector{3, Float32}, StaticArraysCore.SVector{3, Float32}, Tuple{Int64, Int64, Int64}, typeof(profile), Float32, Float32}}}}}}, Int64}}; name::Nothing, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ Metal ~/.julia/packages/Metal/OchAS/src/compiler/execution.jl:180
  [9] mtlfunction
    @ ~/.julia/packages/Metal/OchAS/src/compiler/execution.jl:178 [inlined]
 [10] macro expansion
    @ ~/.julia/packages/Metal/OchAS/src/compiler/execution.jl:85 [inlined]
 [11] #launch_heuristic#96
    @ ~/.julia/packages/Metal/OchAS/src/gpuarrays.jl:14 [inlined]
 [12] launch_heuristic
    @ ~/.julia/packages/Metal/OchAS/src/gpuarrays.jl:12 [inlined]
 [13] _copyto!
    @ ~/.julia/packages/GPUArrays/Hd5Sk/src/host/broadcast.jl:56 [inlined]
 [14] copyto!
    @ ~/.julia/packages/GPUArrays/Hd5Sk/src/host/broadcast.jl:37 [inlined]
 [15] copy
    @ ~/.julia/packages/GPUArrays/Hd5Sk/src/host/broadcast.jl:28 [inlined]
 [16] materialize(bc::Base.Broadcast.Broadcasted{Metal.MtlArrayStyle{2, Metal.MTL.MTLResourceStorageModePrivate}, Nothing, Krang.ElectronSynchrotronPowerLawIntensity, Tuple{MtlMatrix{Krang.IntensityPixel{Float32}, Metal.MTL.MTLResourceStorageModePrivate}, Base.RefValue{Krang.UnionGeometry{Krang.ConeGeometry{Float32, Tuple{StaticArraysCore.SVector{3, Float32}, StaticArraysCore.SVector{3, Float32}, Tuple{Int64, Int64, Int64}, typeof(profile), Float32, Float32}}, Krang.ConeGeometry{Float32, Tuple{StaticArraysCore.SVector{3, Float32}, StaticArraysCore.SVector{3, Float32}, Tuple{Int64, Int64, Int64}, typeof(profile), Float32, Float32}}}}}})
    @ Base.Broadcast ./broadcast.jl:873
 [17] macro expansion
    @ ~/.julia/packages/Metal/OchAS/src/utilities.jl:10 [inlined]
 [18] top-level scope
    @ ~/.julia/packages/Metal/OchAS/src/pool.jl:175 [inlined]
 [19] top-level scope
    @ ~/Software/Krang.jl/examples/gpuexample.jl:0

I'll work on seeing if I can produce an MWE.


dchang10 commented on September 24, 2024

I think I found the heart of the error. It arises whenever an array is constructed inside the function definition and then looped over. So this causes an error:

arr = MtlArray(zeros(Float32, sze, sze))
function test(pix)
    ans = 0f0
    for n in [0,1,2]
        ans += 1f0
    end
    return sum
end
test.(arr)

while this does not:

arr = MtlArray(zeros(Float32, sze, sze))
function test(pix)
    ans = 0f0
    for n in 0:2
        ans += 1f0
    end
    return sum
end
test.(arr)
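
For reference, the allocation can also be avoided by looping over something stack-allocated such as a tuple, not just a range. Below is a minimal sketch of that workaround (the test_tuple name and the size are illustrative, not from the original script; it assumes Metal.jl is loaded and returns the accumulator ans):

using Metal

sze = 32  # illustrative size, stands in for the sze used in the original script
arr = MtlArray(zeros(Float32, sze, sze))

# Iterating a tuple literal (like the range in the working example above) keeps
# the loop free of CPU-side Array allocations, so the kernel compiles.
function test_tuple(pix)
    ans = 0f0
    for n in (0, 1, 2)   # tuple, no Array is allocated in device code
        ans += 1f0
    end
    return ans
end

test_tuple.(arr)   # returns an MtlArray filled with 3.0f0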


maleadt commented on September 24, 2024

I think I found the heart of the error. It arises whenever an array is constructed inside the function definition and then looped over. So this causes an error:

for n in [0,1,2]

You're allocating a CPU array in there, which is unsupported, as the error message tells you:

julia> test.(arr)
ERROR: InvalidIRError: compiling MethodInstance for (::GPUArrays.var"#broadcast_kernel#38")(::Metal.mtlKernelContext, ::MtlDeviceMatrix{typeof(sum), 1}, ::Base.Broadcast.Broadcasted{Metal.MtlArrayStyle{2, Metal.MTL.MTLResourceStorageModePrivate}, Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}, typeof(test), Tuple{Base.Broadcast.Extruded{MtlDeviceMatrix{Float32, 1}, Tuple{Bool, Bool}, Tuple{Int64, Int64}}}}, ::Int64) resulted in invalid LLVM IR
Reason: unsupported call through a literal pointer (call to ijl_alloc_array_1d)
Stacktrace:
  [1] Array
    @ ./boot.jl:477
  [2] Array
    @ ./boot.jl:486
  [3] similar
    @ ./abstractarray.jl:884
  [4] similar
    @ ./abstractarray.jl:883
  [5] _array_for
    @ ./array.jl:671
  [6] _array_for
    @ ./array.jl:674
  [7] vect
    @ ./array.jl:126
  [8] test
    @ ./REPL[5]:3
  [9] _broadcast_getindex_evalf
    @ ./broadcast.jl:683
 [10] _broadcast_getindex
    @ ./broadcast.jl:656
 [11] getindex
    @ ./broadcast.jl:610
 [12] broadcast_kernel
    @ ~/.julia/packages/GPUArrays/EoKy0/src/host/broadcast.jl:59

Are you sure you've correctly reduced the error? The "Compilation to native code failed" issue you originally reported is something we need to fix, but the InvalidIRError your MWE throws is a problem in your code.

Also, please report the output of Metal.versioninfo().
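
For illustration, a rough sketch of the distinction (the test_alloc name is made up; it assumes Metal.jl is installed): the same function broadcasts fine over a plain Array, since CPU code is allowed to allocate, but triggers the InvalidIRError above once broadcast over an MtlArray.

using Metal

# Allocates a temporary Array inside the loop; fine on the CPU,
# but invalid inside a Metal kernel.
function test_alloc(pix)
    ans = 0f0
    for n in [0, 1, 2]   # CPU-side allocation (ijl_alloc_array_1d)
        ans += 1f0
    end
    return ans
end

cpu = zeros(Float32, 8, 8)
test_alloc.(cpu)              # works: CPU broadcast may allocate
test_alloc.(MtlArray(cpu))    # fails: device code cannot allocate a CPU Array

Metal.versioninfo()           # environment details requested above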


