google / swift
This project was forked from swiftlang/swift.
The Swift Programming Language
Home Page: https://swift.org/
Continuing our discussion from the group here.
Full background - I've just copied my comment directly from the group:
I've had some success using third-party SwiftPM packages by creating a dynamic library and linking to it when launching the REPL. However, the import order of TensorFlow versus other packages seems to matter: importing the third-party library first causes a C++ runtime error in TensorFlow.
Here are a few snippets:
Package.swift
import PackageDescription

let package = Package(
    name: "TFExample",
    products: [
        .library(
            name: "TFExample",
            type: .dynamic, // allow use of this package and its deps from the REPL
            targets: ["TFExample"]
        )
    ],
    dependencies: [
        .package(url: "https://github.com/ReactiveX/RxSwift.git", "4.0.0" ..< "5.0.0")
    ],
    targets: [
        .target(
            name: "TFExample",
            dependencies: ["RxSwift"]),
        .testTarget(
            name: "TFExampleTests",
            dependencies: ["TFExample"]),
    ]
)
... then we just fetch dependencies and build with vanilla commands, then invoke the REPL:
Invocation
swift -I/usr/lib/swift/clang/include -I/usr/src/TFExample/.build/debug -L/usr/src/TFExample/.build/debug -lTFExample
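Putting it together, the whole flow is roughly the following sketch (the package path /usr/src/TFExample matches the invocation above; the produced library name follows from the manifest):

```shell
# Fetch dependencies and build the dynamic library product.
cd /usr/src/TFExample
swift package resolve   # fetches RxSwift per Package.swift
swift build             # emits .build/debug/libTFExample.so (.dylib on macOS)

# Launch the REPL, pointing it at the build products and linking the library.
swift -I/usr/lib/swift/clang/include \
      -I/usr/src/TFExample/.build/debug \
      -L/usr/src/TFExample/.build/debug \
      -lTFExample
```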
At this point I'm able to import RxSwift and TensorFlow in the REPL in either order without errors; however, once I actually interact with the packages, the incorrect import order does produce a runtime error:
Scenario 1 (OK)
1> import TensorFlow
2> import RxSwift
3> _ = Observable.from([1,2]).subscribe(onNext: { print($0) })
1
2
4> var x = Tensor([[1, 2], [3, 4]])
2018-04-27 17:13:12.514107: I tensorflow/core/platform/cpu_feature_guard.cc:140] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.1 SSE4.2 AVX AVX2 FMA
x: TensorFlow.Tensor<Double> = [[1.0, 2.0], [3.0, 4.0]]
Scenario 2 (runtime error)
1> import RxSwift
2> import TensorFlow
3> _ = Observable.from([1,2]).subscribe(onNext: { print($0) })
1
2
4> var x = Tensor([[1, 2], [3, 4]])
x: TensorFlow.Tensor<Double> =
terminate called after throwing an instance of 'std::logic_error'
what(): basic_string::_M_construct null not valid
The full process is outlined here if more detail is necessary: https://github.com/zachgrayio/swift-tensorflow/blob/example/package/README.md#run-with-dependencies-advanced
Test case:
// This testcase exposed bb argument and source location manipulation problems.
public func testEagerLoop() -> Int32 {
  var a = Tensor<Int32>(6)
  // expected-error @+2 {{GraphGen cannot lower a 'receive' from the host yet}}
  var count = Tensor<Int32>(0) // expected-warning {{value implicitly copied to the host}}
  while a.elementsEqual(1).scalar! { // expected-warning 2 {{implicitly copied}} expected-note {{value used here}}
    if (a % 2).elementsEqual(0).scalar! { // expected-warning 2 {{implicitly copied}} expected-note {{value used here}}
      a = a / 2
    } else {
      a = 3 * a + 1
    }
    count += 1
  }
  return count.scalar! // expected-note {{value used here}}
}
F0503 11:47:05.977181 43806 logging.cc:78] assert.h assertion failed at third_party/unsupported_toolchains/swift/src/swift/lib/SILOptimizer/Mandatory/TFLowerGraph.cpp:455 in TF_Output (anonymous namespace)::TFGraphLowering::getOperandValue((anonymous namespace)::SILOpResult): valueInfo.first.oper != nullptr && "didn't find live-in value?"
This occurs when building the TensorFlow stdlib module with -enable-sil-ownership on. We had to turn off SIL ownership because of this.
1. While emitting SIL for 'enableTPU(infeed:)' at /Users/rxwei/Development/swift-source/swift/stdlib/public/TensorFlow/CompilerRuntime.swift:47:10
2. While silgen emitFunction SIL function "@$S10TensorFlowAAO9enableTPU6infeedySb_tFZ".
for 'enableTPU(infeed:)' at /Users/rxwei/Development/swift-source/swift/stdlib/public/TensorFlow/CompilerRuntime.swift:47:10
3. While verifying SIL function "@$S10TensorFlowAAO9enableTPU6infeedySb_tFZ".
for 'enableTPU(infeed:)' at /Users/rxwei/Development/swift-source/swift/stdlib/public/TensorFlow/CompilerRuntime.swift:47:10
[71/358] Generating /Users/rxwei/Development/swift-source/build/bui...wift-macosx-x86_64/./lib/swift/macosx/x86_64/Foundation.swiftmodule
/Users/rxwei/Development/swift-source/swift/stdlib/public/SDK/Foundation/NSRange.swift:200:21: warning: 'CustomPlaygroundQuickLookable' is deprecated: CustomPlaygroundQuickLookable will be removed in a future Swift version. For customizing how types are presented in playgrounds, use CustomPlaygroundDisplayConvertible instead.
^
Function: '$S10TensorFlowAAO9enableTPU6infeedySb_tFZ'
Error! Found a leaked owned value that was never consumed.
Value: %9 = builtin "__tfop_tfc.configureTPU,enableInfeed"(%0 : $Bool) : $()
triggering standard assertion failure routine
UNREACHABLE executed at /Users/rxwei/Development/swift-source/swift/lib/SIL/SILOwnershipVerifier.cpp:1619!
0 swift 0x0000000111b79ae8 llvm::sys::PrintStackTrace(llvm::raw_ostream&) + 40
1 swift 0x0000000111b78a36 llvm::sys::RunSignalHandlers() + 86
2 swift 0x0000000111b7a0ae SignalHandler(int) + 366
3 libsystem_platform.dylib 0x00007fff77dfcf5a _sigtramp + 26
4 swift 0x00000001128b3b18 (anonymous namespace)::DarwinX86AsmBackend::getCompactUnwindRegNum(unsigned int) const::CU64BitRegs + 238062
5 libsystem_c.dylib 0x00007fff77c27312 abort + 127
6 swift 0x0000000111b17540 LLVMInstallFatalErrorHandler + 0
7 swift 0x000000010f167505 (anonymous namespace)::SILValueOwnershipChecker::handleError(llvm::function_ref<void ()>&&) const + 165
8 swift 0x000000010f15c379 (anonymous namespace)::SILValueOwnershipChecker::check() + 2697
9 swift 0x000000010f15b62f swift::SILValue::verifyOwnership(swift::SILModule&, swift::DeadEndBlocks*) const + 591
10 swift 0x000000010f1510ab (anonymous namespace)::SILVerifier::visitSILInstruction(swift::SILInstruction*) + 3051
11 swift 0x000000010f140b02 swift::SILInstructionVisitor<(anonymous namespace)::SILVerifier, void>::visit(swift::SILInstruction*) + 10418
12 swift 0x000000010f13c43c (anonymous namespace)::SILVerifier::visitSILBasicBlock(swift::SILBasicBlock*) + 1212
13 swift 0x000000010f138797 swift::SILFunction::verify(bool) const + 8631
14 swift 0x000000010eb1b1ed swift::Lowering::SILGenModule::postEmitFunction(swift::SILDeclRef, swift::SILFunction*) + 205
15 swift 0x000000010eb23777 swift::Lowering::SILGenModule::emitFunction(swift::FuncDecl*)::$_2::operator()(swift::SILFunction*) const + 327
16 swift 0x000000010eb1ac2d swift::Lowering::SILGenModule::emitFunction(swift::FuncDecl*) + 733
17 swift 0x000000010ebf7909 (anonymous namespace)::SILGenType::emitType() + 233
18 swift 0x000000010ebf7819 swift::Lowering::SILGenModule::visitNominalTypeDecl(swift::NominalTypeDecl*) + 25
19 swift 0x000000010eb2042b swift::Lowering::SILGenModule::emitSourceFile(swift::SourceFile*, unsigned int) + 763
20 swift 0x000000010eb21125 swift::SILModule::constructSIL(swift::ModuleDecl*, swift::SILOptions&, swift::FileUnit*, llvm::Optional<unsigned int>, bool) + 437
21 swift 0x000000010eb2163c swift::performSILGeneration(swift::ModuleDecl*, swift::SILOptions&, bool) + 28
22 swift 0x000000010e422898 performCompile(swift::CompilerInstance&, swift::CompilerInvocation&, llvm::ArrayRef<char const*>, int&, swift::FrontendObserver*, swift::UnifiedStatsReporter*) + 14312
23 swift 0x000000010e41e135 swift::performFrontend(llvm::ArrayRef<char const*>, char const*, void*, swift::FrontendObserver*) + 2965
24 swift 0x000000010e3d6099 main + 2249
25 libdyld.dylib 0x00007fff77b7b115 start + 1
Stack dump:
I'm not sure whether this bug belongs under google/swift or tensorflow/swift.
This document says you can use the Python API from Swift code.
So I installed the prebuilt package and ran the following code, but it does not work.
import Python
let np = Python.import("numpy")
let ary = np.array([1,2,3])
print(ary)
// ERROR:
// main.swift:3:19: error: cannot call value of non-function type 'PyValue'
// let ary = np.array([1,2,3])
The following code works instead:
import Python
let np = Python.import("numpy")
let ary = np.array.call(with:[1,2,3])
print(ary)
//=> [1 2 3]
Is this a documentation bug, or intended behavior?
Environment:
Starting from a typical vanilla Cocoa app named TFImageClassifier, I copied portions of the existing MNIST example code into the app without adding any interface sugar-coating, but I am getting the following error in Xcode 9.3:
CompileSwift normal x86_64 /Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift
cd /Users/testuser/Desktop/TFImageClassifier
/Library/Developer/Toolchains/swift-tensorflow-DEVELOPMENT-2018-04-26-a.xctoolchain/usr/bin/swift -frontend -c -primary-file /Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift /Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/AppDelegate.swift -target x86_64-apple-macosx10.13 -enable-objc-interop -sdk /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.13.sdk -I /Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Products/Debug -F /Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Products/Debug -enable-testing -g -module-cache-path /Users/testuser/Library/Developer/Xcode/DerivedData/ModuleCache.noindex -swift-version 4 -enforce-exclusivity=checked -Onone -D DEBUG -serialize-debugging-options -Xcc -I/Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/swift-overrides.hmap -Xcc -iquote -Xcc /Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/TFImageClassifier-generated-files.hmap -Xcc -I/Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/TFImageClassifier-own-target-headers.hmap -Xcc -I/Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/TFImageClassifier-all-target-headers.hmap -Xcc -iquote -Xcc 
/Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/TFImageClassifier-project-headers.hmap -Xcc -I/Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Products/Debug/include -Xcc -I/Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/DerivedSources/x86_64 -Xcc -I/Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/DerivedSources -Xcc -DDEBUG=1 -Xcc -working-directory/Users/testuser/Desktop/TFImageClassifier -emit-module-doc-path /Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/Objects-normal/x86_64/ViewController~partial.swiftdoc -serialize-diagnostics-path /Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/Objects-normal/x86_64/ViewController.dia -module-name TFImageClassifier -emit-module-path /Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/Objects-normal/x86_64/ViewController~partial.swiftmodule -emit-dependencies-path /Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/Objects-normal/x86_64/ViewController.d -emit-reference-dependencies-path 
/Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/Objects-normal/x86_64/ViewController.swiftdeps -o /Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/Objects-normal/x86_64/ViewController.o -index-store-path /Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Index/DataStore -index-system-modules
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:63:35: warning: value implicitly copied to the accelerator, use .toDevice() to make transfer explicit
let labels = Tensor<Float>(oneHotAtIndices: numericLabels, depth: 10, axis: -1)
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:63:35: warning: value implicitly copied to the accelerator, use .toDevice() to make transfer explicit
let labels = Tensor<Float>(oneHotAtIndices: numericLabels, depth: 10, axis: -1)
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:61:39: warning: class methods always cause a copy to the accelerator, use .toDevice() to make transfer explicit
let (images, numericLabels) = readMnist(imagesFile: imagesFile,
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:63:34: note: value used here
let labels = Tensor<Float>(oneHotAtIndices: numericLabels, depth: 10, axis: -1)
~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:71:30: warning: 'Tensor<Float>' implicitly copied to the accelerator, use .toDevice() to make transfer explicit
var w1 = Tensor<Float>(randomUniform: [784, 30])
~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:72:30: warning: 'Tensor<Float>' implicitly copied to the accelerator, use .toDevice() to make transfer explicit
var w2 = Tensor<Float>(randomUniform: [30, 10])
~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:73:30: warning: 'Tensor<Float>' implicitly copied to the accelerator, use .toDevice() to make transfer explicit
var b1 = Tensor<Float>(zeros: [1, 30])
~~~~~~~~~~~~^~~~~~~~~~~~~~~~~
IMPLICIT COPY TO ACCEL OF: %341 = apply %339<Int32>(%315) : $@convention(thin) <τ_0_0 where τ_0_0 : AccelerableByTensorFlow> (@owned Array<τ_0_0>) -> @owned TensorHandle<τ_0_0> // users: %345, %344, %342
IMPLICIT COPY TO ACCEL BY: %344 = builtin "__tfop_Fill,$in,$in"(%341 : $TensorHandle<Int32>, %343 : $TensorHandle<Float>) : $TensorHandle<Float> // users: %536, %347
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:73:30: warning: 'Tensor<Float>' implicitly copied to the accelerator, use .toDevice() to make transfer explicit
var b1 = Tensor<Float>(zeros: [1, 30])
~~~~~~~~~~~~^~~~~~~~~~~~~~~~~
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:74:30: warning: 'Tensor<Float>' implicitly copied to the accelerator, use .toDevice() to make transfer explicit
var b2 = Tensor<Float>(zeros: [1, 10])
~~~~~~~~~~~~^~~~~~~~~~~~~~~~~
IMPLICIT COPY TO ACCEL OF: %408 = apply %406<Int32>(%382) : $@convention(thin) <τ_0_0 where τ_0_0 : AccelerableByTensorFlow> (@owned Array<τ_0_0>) -> @owned TensorHandle<τ_0_0> // users: %412, %411, %409
IMPLICIT COPY TO ACCEL BY: %411 = builtin "__tfop_Fill,$in,$in"(%408 : $TensorHandle<Int32>, %410 : $TensorHandle<Float>) : $TensorHandle<Float> // users: %536, %414
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:74:30: warning: 'Tensor<Float>' implicitly copied to the accelerator, use .toDevice() to make transfer explicit
var b2 = Tensor<Float>(zeros: [1, 10])
~~~~~~~~~~~~^~~~~~~~~~~~~~~~~
IMPLICIT COPY TO ACCEL OF: %1253 = struct_extract %1252 : $Bool, #Bool._value // user: %1254
IMPLICIT COPY TO ACCEL BY: cond_br %1253, bb50, bb51 // id: %1254
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:106:19: warning: method result implicitly copied to the accelerator, use .toDevice() to make transfer explicit
} while i < iterationCount
~~^~~~~~~~~~~~~~~~
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:61:39: warning: class methods always cause a copy to the accelerator, use .toDevice() to make transfer explicit
let (images, numericLabels) = readMnist(imagesFile: imagesFile,
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:82:29: note: value used here
let z1 = images ⊗ w1 + b1
IMPLICIT COPY TO ACCEL OF: %1255 = struct_extract %1066 : $Tensor<Float>, #Tensor.handle // user: %1259
IMPLICIT COPY TO ACCEL BY: %559 = builtin "__tfop_MatMul,$in,$in"(%135 : $TensorHandle<Float>, %541 : $TensorHandle<Float>) : $TensorHandle<Float> // users: %574, %570, %568
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:96:16: warning: method result implicitly copied to the accelerator, use .toDevice() to make transfer explicit
w1 -= dw1 * learningRate
~~~^~~~~~~~~~~~~~~~~~~~~
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:82:29: note: value used here
let z1 = images ⊗ w1 + b1
IMPLICIT COPY TO ACCEL OF: %1256 = struct_extract %1088 : $Tensor<Float>, #Tensor.handle // user: %1259
IMPLICIT COPY TO ACCEL BY: %570 = builtin "__tfop_Add,$in,$in"(%559 : $TensorHandle<Float>, %542 : $TensorHandle<Float>) : $TensorHandle<Float> // users: %1247, %598, %596, %595, %578, %571
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:97:16: warning: method result implicitly copied to the accelerator, use .toDevice() to make transfer explicit
b1 -= db1 * learningRate
~~~^~~~~~~~~~~~~~~~~~~~~
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:82:36: note: value used here
let z1 = images ⊗ w1 + b1
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:83:22: warning: value implicitly copied to the accelerator, use .toDevice() to make transfer explicit
let h1 = sigmoid(z1)
^~~~~~~~~~~
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:83:22: warning: value implicitly copied to the accelerator, use .toDevice() to make transfer explicit
let h1 = sigmoid(z1)
^~~~~~~~~~~
IMPLICIT COPY TO ACCEL OF: %1257 = struct_extract %1112 : $Tensor<Float>, #Tensor.handle // user: %1259
IMPLICIT COPY TO ACCEL BY: %632 = builtin "__tfop_MatMul,$in,$in"(%615 : $TensorHandle<Float>, %554 : $TensorHandle<Float>) : $TensorHandle<Float> // users: %646, %642, %640
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:98:16: warning: method result implicitly copied to the accelerator, use .toDevice() to make transfer explicit
w2 -= dw2 * learningRate
~~~^~~~~~~~~~~~~~~~~~~~~
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:84:25: note: value used here
let z2 = h1 ⊗ w2 + b2
IMPLICIT COPY TO ACCEL OF: %1258 = struct_extract %1134 : $Tensor<Float>, #Tensor.handle // user: %1259
IMPLICIT COPY TO ACCEL BY: %642 = builtin "__tfop_Add,$in,$in"(%632 : $TensorHandle<Float>, %555 : $TensorHandle<Float>) : $TensorHandle<Float> // users: %1245, %670, %668, %667, %650, %643
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:99:16: warning: method result implicitly copied to the accelerator, use .toDevice() to make transfer explicit
b2 -= db2 * learningRate
~~~^~~~~~~~~~~~~~~~~~~~~
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:84:32: note: value used here
let z2 = h1 ⊗ w2 + b2
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:85:31: warning: value implicitly copied to the accelerator, use .toDevice() to make transfer explicit
let predictions = sigmoid(z2)
^~~~~~~~~~~
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:85:31: warning: value implicitly copied to the accelerator, use .toDevice() to make transfer explicit
let predictions = sigmoid(z2)
^~~~~~~~~~~
IMPLICIT COPY TO ACCEL OF: %757 = apply %755<Int32>(%736) : $@convention(thin) <τ_0_0 where τ_0_0 : AccelerableByTensorFlow> (@owned Array<τ_0_0>) -> @owned TensorHandle<τ_0_0> // users: %764, %761, %760
IMPLICIT COPY TO ACCEL BY: %761 = builtin "__tfop_Transpose,$in,$in,Tperm"(%626 : $TensorHandle<Float>, %757 : $TensorHandle<Int32>, %759 : $@thin Int32.Type) : $TensorHandle<Float> // users: %776, %772, %770, %762
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:89:26: warning: method result implicitly copied to the accelerator, use .toDevice() to make transfer explicit
let dw2 = h1.transposed(withPermutations: 1, 0) ⊗ dz2
IMPLICIT COPY TO ACCEL OF: %823 = apply %820<Int32>(%806) : $@convention(thin) <τ_0_0 where τ_0_0 : AccelerableByTensorFlow> (@owned Array<τ_0_0>) -> @owned TensorHandle<τ_0_0> // user: %826
IMPLICIT COPY TO ACCEL BY: %826 = builtin "__tfop_Sum,$in,$in,keep_dims,Tidx"(%701 : $TensorHandle<Float>, %823 : $TensorHandle<Int32>, %824 : $Builtin.Int1, %825 : $@thin Int32.Type) : $TensorHandle<Float> // users: %1241, %1121, %1120, %1118, %1117, %827
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:90:27: warning: method result implicitly copied to the accelerator, use .toDevice() to make transfer explicit
let db2 = dz2.sum(squeezingAxes: 0)
~~~~^~~~~~~~~~~~~~~~~~~~~
IMPLICIT COPY TO ACCEL OF: %883 = apply %881<Int32>(%862) : $@convention(thin) <τ_0_0 where τ_0_0 : AccelerableByTensorFlow> (@owned Array<τ_0_0>) -> @owned TensorHandle<τ_0_0> // users: %889, %887, %886
IMPLICIT COPY TO ACCEL BY: %887 = builtin "__tfop_Transpose,$in,$in,Tperm"(%565 : $TensorHandle<Float>, %883 : $TensorHandle<Int32>, %885 : $@thin Int32.Type) : $TensorHandle<Float> // users: %897, %895, %894
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:91:34: warning: method result implicitly copied to the accelerator, use .toDevice() to make transfer explicit
let dz1 = dz2.dot(w2.transposed(withPermutations: 1, 0)) * h1 * (1 - h1)
~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:61:39: warning: class methods always cause a copy to the accelerator, use .toDevice() to make transfer explicit
let (images, numericLabels) = readMnist(imagesFile: imagesFile,
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:92:30: note: value used here
let dw1 = images.transposed(withPermutations: 1, 0) ⊗ dz1
IMPLICIT COPY TO ACCEL OF: %974 = apply %972<Int32>(%953) : $@convention(thin) <τ_0_0 where τ_0_0 : AccelerableByTensorFlow> (@owned Array<τ_0_0>) -> @owned TensorHandle<τ_0_0> // users: %980, %978, %977
IMPLICIT COPY TO ACCEL BY: %978 = builtin "__tfop_Transpose,$in,$in,Tperm"(%158 : $TensorHandle<Float>, %974 : $TensorHandle<Int32>, %976 : $@thin Int32.Type) : $TensorHandle<Float> // users: %992, %988, %986
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:92:30: warning: method result implicitly copied to the accelerator, use .toDevice() to make transfer explicit
let dw1 = images.transposed(withPermutations: 1, 0) ⊗ dz1
IMPLICIT COPY TO ACCEL OF: %1039 = apply %1036<Int32>(%1022) : $@convention(thin) <τ_0_0 where τ_0_0 : AccelerableByTensorFlow> (@owned Array<τ_0_0>) -> @owned TensorHandle<τ_0_0> // user: %1042
IMPLICIT COPY TO ACCEL BY: %1042 = builtin "__tfop_Sum,$in,$in,keep_dims,Tidx"(%918 : $TensorHandle<Float>, %1039 : $TensorHandle<Int32>, %1040 : $Builtin.Int1, %1041 : $@thin Int32.Type) : $TensorHandle<Float> // users: %1238, %1075, %1074, %1072, %1071, %1043
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:93:27: warning: method result implicitly copied to the accelerator, use .toDevice() to make transfer explicit
let db1 = dz1.sum(squeezingAxes: 0)
~~~~^~~~~~~~~~~~~~~~~~~~~
IMPLICIT COPY TO ACCEL OF: %1192 = apply %1189<Int32>(%1170) : $@convention(thin) <τ_0_0 where τ_0_0 : AccelerableByTensorFlow> (@owned Array<τ_0_0>) -> @owned TensorHandle<τ_0_0> // user: %1195
IMPLICIT COPY TO ACCEL BY: %1195 = builtin "__tfop_Mean,$in,$in,keep_dims,Tidx"(%1140 : $TensorHandle<Float>, %1192 : $TensorHandle<Int32>, %1193 : $Builtin.Int1, %1194 : $@thin Int32.Type) : $TensorHandle<Float> // users: %1219, %1211, %1210
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:102:34: warning: method result implicitly copied to the accelerator, use .toDevice() to make transfer explicit
loss = dz2.squared().mean(squeezingAxes: 1, 0).scalarized()
~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~
IMPLICIT COPY TO ACCEL OF: %1209 = apply %1207<Int32>(%1203) : $@convention(thin) <τ_0_0 where τ_0_0 : AccelerableByTensorFlow> (@owned Array<τ_0_0>) -> @owned TensorHandle<τ_0_0> // user: %1211
IMPLICIT COPY TO ACCEL BY: %1211 = builtin "__tfop_Reshape,$in,$in"(%1195 : $TensorHandle<Float>, %1209 : $TensorHandle<Int32>) : $TensorHandle<Float> // user: %1215
/Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:102:60: warning: method result implicitly copied to the accelerator, use .toDevice() to make transfer explicit
loss = dz2.squared().mean(squeezingAxes: 1, 0).scalarized()
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^~~~~~~~~~~~
Assertion failed: (retainReleaseBalance >= 0), function balanceRetainReleaseCount, file /Users/danielzheng/swift-new/swift/lib/SILOptimizer/Mandatory/TFPartition.cpp, line 2775.
0 swift 0x00000001040c1d78 llvm::sys::PrintStackTrace(llvm::raw_ostream&) + 40
1 swift 0x00000001040c2486 SignalHandler(int) + 694
2 libsystem_platform.dylib 0x00007fff64a4bf5a _sigtramp + 26
3 libsystem_platform.dylib 0x00007ffeef37ea01 _sigtramp + 2324900545
4 libsystem_c.dylib 0x00007fff64876312 abort + 127
5 libsystem_c.dylib 0x00007fff6483e368 basename_r + 0
6 swift 0x00000001012c93b4 (anonymous namespace)::TFFunctionPartition::insertTensorComputationStartEndTerminate(llvm::ArrayRef<swift::SILValue>) + 12084
7 swift 0x00000001012b73e2 (anonymous namespace)::TFFunctionPartition::partition() + 1986
8 swift 0x00000001012b2558 (anonymous namespace)::TFPartition::run() + 1080
9 swift 0x00000001011fdb53 swift::SILPassManager::runPassOnFunction(unsigned int, swift::SILFunction*) + 1475
10 swift 0x00000001011feaa3 swift::SILPassManager::runFunctionPasses(unsigned int, unsigned int) + 1315
11 swift 0x00000001011ff764 swift::SILPassManager::execute() + 660
12 swift 0x0000000100a3c63b swift::SILPassManager::executePassPipelinePlan(swift::SILPassPipelinePlan const&) + 187
13 swift 0x00000001012089a3 swift::runSILTFPartitionPass(swift::SILModule&) + 99
14 swift 0x00000001008cd8f2 performCompile(swift::CompilerInstance&, swift::CompilerInvocation&, llvm::ArrayRef<char const*>, int&, swift::FrontendObserver*, swift::UnifiedStatsReporter*) + 13490
15 swift 0x00000001008c942e swift::performFrontend(llvm::ArrayRef<char const*>, char const*, void*, swift::FrontendObserver*) + 3310
16 swift 0x000000010087ff83 main + 2051
17 libdyld.dylib 0x00007fff647ca115 start + 1
18 libdyld.dylib 0x0000000000000043 start + 2609078063
Stack dump:
0. Program arguments: /Library/Developer/Toolchains/swift-tensorflow-DEVELOPMENT-2018-04-26-a.xctoolchain/usr/bin/swift -frontend -c -primary-file /Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift /Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/AppDelegate.swift -target x86_64-apple-macosx10.13 -enable-objc-interop -sdk /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.13.sdk -I /Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Products/Debug -F /Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Products/Debug -enable-testing -g -module-cache-path /Users/testuser/Library/Developer/Xcode/DerivedData/ModuleCache.noindex -swift-version 4 -enforce-exclusivity=checked -Onone -D DEBUG -serialize-debugging-options -Xcc -I/Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/swift-overrides.hmap -Xcc -iquote -Xcc /Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/TFImageClassifier-generated-files.hmap -Xcc -I/Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/TFImageClassifier-own-target-headers.hmap -Xcc -I/Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/TFImageClassifier-all-target-headers.hmap -Xcc -iquote -Xcc 
/Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/TFImageClassifier-project-headers.hmap -Xcc -I/Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Products/Debug/include -Xcc -I/Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/DerivedSources/x86_64 -Xcc -I/Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/DerivedSources -Xcc -DDEBUG=1 -Xcc -working-directory/Users/testuser/Desktop/TFImageClassifier -emit-module-doc-path /Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/Objects-normal/x86_64/ViewController~partial.swiftdoc -serialize-diagnostics-path /Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/Objects-normal/x86_64/ViewController.dia -module-name TFImageClassifier -emit-module-path /Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/Objects-normal/x86_64/ViewController~partial.swiftmodule -emit-dependencies-path /Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/Objects-normal/x86_64/ViewController.d -emit-reference-dependencies-path 
/Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/Objects-normal/x86_64/ViewController.swiftdeps -o /Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Build/Intermediates.noindex/TFImageClassifier.build/Debug/TFImageClassifier.build/Objects-normal/x86_64/ViewController.o -index-store-path /Users/testuser/Library/Developer/Xcode/DerivedData/TFImageClassifier-dhgxpyvnqbzvumbysistfzrsimkg/Index/DataStore -index-system-modules
1. While running pass #70 SILFunctionTransform "TFPartition" on SILFunction "@$S17TFImageClassifier14ViewControllerC4mainyyF".
for 'main()' at /Users/testuser/Desktop/TFImageClassifier/TFImageClassifier/ViewController.swift:47:5
What should I change in the source code below?
//
// ViewController.swift
// TFImageClassifier
//
// Created by Shyamal Chandra on 4/26/18.
// Copyright © 2018 Shyamal Chandra. All rights reserved.
//
import Cocoa
import TensorFlow
class ViewController: NSViewController {
override func viewDidLoad() {
super.viewDidLoad()
main()
// Do any additional setup after loading the view.
}
override var representedObject: Any? {
didSet {
// Update the view, if already loaded.
}
}
// Code taken from: https://raw.githubusercontent.com/tensorflow/swift-models/master/MNIST/MNIST.swift
public func readMnist(
imagesFile: String, labelsFile: String
) -> (Tensor<Float>, Tensor<Int32>) {
print("Reading data.")
let imageData =
try! Data(contentsOf: URL(fileURLWithPath: imagesFile)).dropFirst(16)
let labelData =
try! Data(contentsOf: URL(fileURLWithPath: labelsFile)).dropFirst(8)
let images = imageData.map { Float($0) }
let labels = labelData.map { Int32($0) }
let rowCount = Int32(labels.count)
let columnCount = Int32(images.count) / rowCount
print("Constructing data tensors.")
let imagesTensor = Tensor(shape: [rowCount, columnCount], scalars: images)
let labelsTensor = Tensor(labels)
return (imagesTensor.toDevice(), labelsTensor.toDevice())
}
func main() {
// Get script directory. This is necessary for MNIST.swift to work when
// invoked from any directory.
let currentDirectory =
URL(fileURLWithPath: FileManager.default.currentDirectoryPath)
let currentScriptPath =
URL(fileURLWithPath: CommandLine.arguments[0], relativeTo: currentDirectory)
let scriptDirectory = currentScriptPath.appendingPathComponent("..")
// Get training data.
let imagesFile =
scriptDirectory.appendingPathComponent("train-images-idx3-ubyte").path
let labelsFile =
scriptDirectory.appendingPathComponent("train-labels-idx1-ubyte").path
let (images, numericLabels) = readMnist(imagesFile: imagesFile,
labelsFile: labelsFile)
let labels = Tensor<Float>(oneHotAtIndices: numericLabels, depth: 10, axis: -1)
// Hyper-parameters.
let iterationCount: Int32 = 20
let learningRate: Float = 0.2
var loss = Float.infinity
// Parameters.
var w1 = Tensor<Float>(randomUniform: [784, 30])
var w2 = Tensor<Float>(randomUniform: [30, 10])
var b1 = Tensor<Float>(zeros: [1, 30])
var b2 = Tensor<Float>(zeros: [1, 10])
// Training loop.
print("Begin training for \(iterationCount) iterations.")
var i: Int32 = 0
repeat {
// Forward pass.
let z1 = images ⊗ w1 + b1
let h1 = sigmoid(z1)
let z2 = h1 ⊗ w2 + b2
let predictions = sigmoid(z2)
// Backward pass.
let dz2 = predictions - labels
let dw2 = h1.transposed(withPermutations: 1, 0) ⊗ dz2
let db2 = dz2.sum(squeezingAxes: 0)
let dz1 = dz2.dot(w2.transposed(withPermutations: 1, 0)) * h1 * (1 - h1)
let dw1 = images.transposed(withPermutations: 1, 0) ⊗ dz1
let db1 = dz1.sum(squeezingAxes: 0)
// Gradient descent.
w1 -= dw1 * learningRate
b1 -= db1 * learningRate
w2 -= dw2 * learningRate
b2 -= db2 * learningRate
// Update loss.
loss = dz2.squared().mean(squeezingAxes: 1, 0).scalarized()
// Update iteration count.
i += 1
} while i < iterationCount
// Print loss.
print("Loss: \(loss)")
}
}
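As a sanity check on the backward pass in main() above: the h1 * (1 - h1) factor is the derivative of the sigmoid, an identity that can be verified numerically in plain Swift (no TensorFlow needed):

```swift
import Foundation

// Verify d/dz sigmoid(z) == sigmoid(z) * (1 - sigmoid(z)) numerically,
// the identity used for dz1 in the backward pass above.
func sigmoid(_ z: Double) -> Double { 1 / (1 + exp(-z)) }

let z = 0.7
let eps = 1e-6
// Central finite difference vs. the analytic derivative.
let numeric = (sigmoid(z + eps) - sigmoid(z - eps)) / (2 * eps)
let analytic = sigmoid(z) * (1 - sigmoid(z))
print(numeric, analytic) // both ≈ 0.2217
assert(abs(numeric - analytic) < 1e-8)
```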
I am trying to port the TensorFlow Python model examples to TensorFlow for Swift, to create a zoo of examples for others.
I looked at the documentation at https://www.tensorflow.org/api_docs/swift/ but it is not clear how the new components match their Python counterparts. I also went through the Swift source code under stdlib/public/Tensorflow/ or stdlib/private/Tensorflow, but there is no table mapping the Swift functions to the Python API functions. Should I go ahead and experiment by trial and error, or is this going to be provided by someone in the community? Is the TensorFlow for Swift documentation incomplete? I don't see anything like the tfjs documentation with examples (e.g. https://js.tensorflow.org/api/0.6.1/ ).
The only thing I could find was the video from the TensorFlow Dev Summit:
https://youtu.be/Yze693W4MaU?t=12m59s
Thanks in advance!
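Until an official mapping table exists, the snippets in this thread already suggest a few Python-to-Swift correspondences (gathered from the MNIST example above; these are observations from sample code, not official documentation):

```swift
import TensorFlow

// Observed correspondences (Python TF -> Swift for TensorFlow):
//   tf.matmul(a, b)              ->  a ⊗ b
//   tf.transpose(a, perm=[1, 0]) ->  a.transposed(withPermutations: 1, 0)
//   tf.reduce_sum(a, axis=0)     ->  a.sum(squeezingAxes: 0)
//   tf.sigmoid(a)                ->  sigmoid(a)
let a = Tensor<Float>([[1, 2], [3, 4]])
let at = a.transposed(withPermutations: 1, 0)
let prod = a ⊗ at
let rowSums = prod.sum(squeezingAxes: 0)
```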
We currently diagnose malformed #tfop usage in SIL passes, which means that we only diagnose them when they are USED, not when they are defined. We should do some basic sanity checks in CSApply.cpp, such as:
Here's an example that we should diagnose in sema that we currently diagnose in SIL passes.
public func invalidAttrTensor(a: Tensor) {
// expected-error @+1 {{attribute 'someAttr' requires a constant argument}}
() = #tfop("foo", someAttr: a)
}
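For contrast, a hedged sketch of the well-formed case (same hypothetical op name as above): the attribute argument is a compile-time constant, so no diagnostic should be emitted:

```swift
public func validAttrTensor(a: Tensor<Float>) {
  // 'someAttr' is a literal, so it can be folded into a constant TF attribute.
  () = #tfop("foo", a, someAttr: 1.0)
}
```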
I'm an iOS developer from China. I'm also a Swift lover and a core translator of the SwiftGG translation group.
We translate articles about iOS and Swift from English blogs into Chinese and publish them on our website for free. We have translated Apple's The Swift Programming Language into Chinese and keep it up to date with each new version. It is also listed on Apple's website as a good example (see the Translations section).
Now we have forked this repo and are translating it into Chinese. Could you add our Chinese version of the Google Swift Guide (maybe not this exact link once we publish a release) to your English website, for example as a translations section? We promise the website won't carry any ads and can be read freely by everyone.
func xor(_ x: Float, _ y: Float) -> Float {
// When this line is:
//   let x = Tensor2D<Float>([[x, y]])
// it emits "'Tensor1D<Float>' implicitly copied to the accelerator, use .toDevice() to make transfer explicit".
// When it is the following, it emits a send/receive error:
let x = Tensor2D<Float>([[x, y]]).toDevice()
let w1 = Tensor2D<Float>(
[[-1.83586664, -0.20809225, 0.47667537, 1.90780607],
[-1.83523219, -0.51167348, 0.15490439, 1.91018065]])
let b1 = Tensor2D<Float>(
[[2.54353216, 0.25132703, -0.16503136, -0.85754058]])
let w2 = Tensor2D<Float>(
[[3.04350065], [0.35590511], [-0.3252157], [3.49349223]])
let b2 = Tensor2D<Float>([[-0.74635993]])
let o1 = tanh(x ⊗ w1 + b1)
let y = tanh(o1 ⊗ w2 + b2)
return y.scalars[0]
}
xor(0.0, 0.0)
xor(0.0, 1.0)
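For reference, the same forward pass can be checked in plain Swift with scalar math (no TensorFlow), using the hard-coded weights from the snippet above:

```swift
import Foundation

// Plain-Swift scalar check of the xor network above: a 2-4-1 tanh MLP
// using the same hard-coded weights as the Tensor2D snippet.
func xorScalar(_ x: Float, _ y: Float) -> Float {
    let w1: [[Float]] = [[-1.83586664, -0.20809225, 0.47667537, 1.90780607],
                         [-1.83523219, -0.51167348, 0.15490439, 1.91018065]]
    let b1: [Float] = [2.54353216, 0.25132703, -0.16503136, -0.85754058]
    let w2: [Float] = [3.04350065, 0.35590511, -0.3252157, 3.49349223]
    let b2: Float = -0.74635993
    // Hidden layer: o1 = tanh(x ⊗ w1 + b1), computed element-wise.
    var o1 = [Float](repeating: 0, count: 4)
    for j in 0..<4 {
        o1[j] = tanh(x * w1[0][j] + y * w1[1][j] + b1[j])
    }
    // Output layer: y = tanh(o1 ⊗ w2 + b2).
    var z = b2
    for j in 0..<4 { z += o1[j] * w2[j] }
    return tanh(z)
}

print(xorScalar(0, 0)) // ≈ 0
print(xorScalar(0, 1)) // ≈ 1
print(xorScalar(1, 0)) // ≈ 1
print(xorScalar(1, 1)) // ≈ 0
```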
On Ubuntu 16.04 and 14.04, I installed:
sudo apt-get install python
curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py && python get-pip.py
pip install numpy
In the REPL, Python.import("numpy")
fails:
$ usr/bin/swift -Iusr/lib/swift/clang/include
Welcome to Swift version 4.2-dev (LLVM 04bdb56f3d, Clang b44dbbdf44, Swift 035acb6e14). Type :help for assistance.
1> import Python
2> let np = Python.import("numpy")
Fatal error: 'try!' expression unexpectedly raised an error: exception:
Importing the multiarray numpy extension module failed. Most
likely you are trying to import a failed build of numpy.
If you're working with a numpy git repo, try `git clean -xdf` (removes all
files not under version control). Otherwise reinstall numpy.
Original error was: /usr/local/lib/python2.7/dist-packages/numpy/core/multiarray.so: undefined symbol: PyExc_UserWarning
: file /home/danielzheng/swift-dan/swift/stdlib/public/core/ErrorType.swift, line 184
Current stack trace:
0 libswiftCore.so 0x00007ffff2307cc0 _swift_stdlib_reportFatalErrorInFile + 215
1 libswiftCore.so 0x00007ffff22aa357 <unavailable> + 4268887
2 libswiftCore.so 0x00007ffff2022f0d <unavailable> + 1617677
3 libswiftCore.so 0x00007ffff22aa1ed <unavailable> + 4268525
4 libswiftCore.so 0x00007ffff22aa157 <unavailable> + 4268375
5 libswiftCore.so 0x00007ffff2023768 <unavailable> + 1619816
6 libswiftCore.so 0x00007ffff22aa100 <unavailable> + 4268288
7 libswiftCore.so 0x00007ffff2022f0d <unavailable> + 1617677
8 libswiftCore.so 0x00007ffff21e4988 <unavailable> + 3459464
9 libswiftCore.so 0x00007ffff20b1c5a <unavailable> + 2202714
Process 70759 stopped
* thread #1, name = 'repl_swift', stop reason = signal SIGILL: illegal instruction operand
frame #0: 0x00007ffff21e4990 libswiftCore.so`function signature specialization <Arg[1] = Exploded> of Swift._assertionFailure(_: Swift.StaticString, _: Swift.String, file: Swift.StaticString, line: Swift.UInt, flags: Swift.UInt32) -> Swift.Never + 144
libswiftCore.so`function signature specialization <Arg[1] = Exploded> of Swift._assertionFailure(_: Swift.StaticString, _: Swift.String, file: Swift.StaticString, line: Swift.UInt, flags: Swift.UInt32) -> Swift.Never:
-> 0x7ffff21e4990 <+144>: ud2
0x7ffff21e4992: nopw %cs:(%rax,%rax)
libswiftCore.so`function signature specialization <Arg[3] = Dead> of generic specialization <Swift._ContiguousArrayBuffer<Swift.UInt16>, Swift._IgnorePointer<Swift.UInt16>> of (extension in Swift):Swift._ArrayBufferProtocol._arrayOutOfPlaceUpdate<A where A1: Swift._PointerFunction, A.Element == A1.Element>(inout Swift._ContiguousArrayBuffer<A.Element>, Swift.Int, Swift.Int, A1) -> ():
0x7ffff21e49a0 <+0>: pushq %rbp
0x7ffff21e49a1 <+1>: movq %rsp, %rbp
Target 0: (repl_swift) stopped.
np: Python.PyValue =terminate called after throwing an instance of 'std::logic_error'
what(): basic_string::_M_construct null not valid
[1] 70751 abort (core dumped) usr/bin/swift
I'm using the 05-03 pre-built packages.
Some other info:
Python.import("numpy") works fine using the interpreter/compiler.
Python.import("os") and Python.import("datetime") work in the REPL.
The same failure occurs with other pip-installed packages (e.g. requests, installed via pip install requests): Python.import("requests") also crashes.
(by @lattner)
This code should work without copies:
// Sigmoid shouldn't cause copies. This should compile with no copy warnings/errors.
public func testSigmoid(x: Tensor<Float>, y: Tensor<Float>) -> (Tensor<Float>, Tensor<Float>) {
let a = sigmoid(x.toDevice()).toHost()
let b = sigmoid(y.toDevice()).toHost()
return (a, b)
}
Again, this is brought over from the mailing list - attempting to edit the original message down to something relevant to this specific issue.
import Python fails when I include the -lTFExample flag in the REPL invocation to link the built TFExample library so I can access third-party packages.
1> import Python
error: Couldn't lookup symbols:
_swift_FORCE_LOAD_$_swiftPython
If I don't include that flag, I get runtime errors when trying to interact with imported third-party libraries, which is expected because the symbols will be missing.
This is the catch model with all public modifiers removed. The original version runs OK, but it triggers an internal error when the public modifiers are removed.
import TensorFlow
// Note: This is a work in progress and training doesn't quite work.
// Here are areas for improvement:
// - Adopt a more principled reinforcement learning algorithm (e.g. policy
// gradients). The algorithm should perform some tensor computation (not a
// purely table-based approach).
// - `CatchAgent.step` calculates loss from the wrong reward. It uses the reward
// at time `t+1`, but should actually use the reward from time `t`. This
// requires saving the previous reward somehow.
// - The current back-propagation calculation may be incorrect.
// - It may be better to use a different initialization scheme for the layers of
// `CatchAgent`.
extension Sequence {
/// Returns elements' descriptions joined by a separator.
func description(joinedBy separator: String) -> String {
return map{"\($0)"}.joined(separator: separator)
}
}
typealias Observation = ShapedArray<Float>
typealias Reward = Float
protocol Environment {
associatedtype Action : Equatable
mutating func step(
with action: Action
) -> (observation: Observation, reward: Reward)
mutating func reset() -> Observation
}
protocol Agent {
associatedtype Action : Equatable
mutating func step(
with state: (observation: Observation, reward: Reward)
) -> Action
}
struct CatchAgent : Agent {
typealias Action = CatchAction
@_versioned var layer1: FullyConnectedLayer<Float>
@_versioned var layer2: FullyConnectedLayer<Float>
@_versioned let learningRate: Float
}
extension CatchAgent {
init(learningRate: Float) {
layer1 = FullyConnectedLayer(inputCount: 3, outputCount: 50)
layer2 = FullyConnectedLayer(inputCount: 50, outputCount: 3)
self.learningRate = learningRate
}
/// Performs one "step" (or parameter update) based on the specified
/// observation and reward.
@inline(never)
mutating func step(
with state: (observation: Observation, reward: Reward)
) -> Action {
// NOTE: using `self.layer1` directly causes a send error. This is likely
// because the function is mutating so referencing `self.layer1` produces a
// load.
// The workaround here is to:
// - Bind `self.layer1` to a local variable.
// - Perform tensor computations using the local variable.
// - After all computations, set `self.layer1` to the local variable.
// Initial setup.
let (observation, reward) = state
var layer1 = self.layer1
var layer2 = self.layer2
let learningRate = self.learningRate
// Inference.
let input = Tensor<Float>(observation).rankLifted()
let pred1 = layer1.applied(to: input)
let output1 = sigmoid(pred1)
let pred2 = layer2.applied(to: output1)
let output2 = sigmoid(pred2)
let maxIndex = output2.argmax()
// Back-propagation.
let dOutput2 = output2 * (1 - output2)
let (_, dParameters2) = layer2.gradient(for: output1,
backpropagating: dOutput2)
let dOutput1 = output1 * (1 - output1)
let (_, dParameters1) = layer1.gradient(for: input,
backpropagating: dOutput1)
// Negative log loss.
// FIXME: Loss is calculated from the wrong reward! It should be calculated
// from the previous state. Fixing this is *most likely* to improve
// training.
// FIXME: indexing with `maxIndex` directly causes a send.
// let loss = -log(output2[maxIndex]) * reward
// FIXME: This infers to the variadic `max` function, which acts as a no-op.
// let loss = -log(output2.max()) * reward
let maxValue: Float = output2.max()
let loss = -log(Tensor(maxValue)) * reward
layer1.parameters.update(with: dParameters1,
by: { $0 -= learningRate * loss * $1 })
layer2.parameters.update(with: dParameters2,
by: { $0 -= learningRate * loss * $1 })
self.layer1 = layer1
self.layer2 = layer2
let action = CatchAction(rawValue: Int(maxIndex))!
return action
}
/// Returns the perfect action, given an observation.
/// If the ball is left of the paddle, returns `left`.
/// If the ball is right of the paddle, returns `right`.
/// Otherwise, returns `none`.
/// Note: This function is for reference and is not used by `CatchAgent`.
func perfectAction(for observation: Observation) -> Action {
let paddleX = observation.scalars[0]
let ballX = observation.scalars[1]
if paddleX > ballX {
return .right
} else if paddleX < ballX {
return .left
}
return .none
}
/// Returns a random action.
/// Note: This function is for reference and is not used by `CatchAgent`.
func randomAction() -> Action {
let id = Int(RandomState.global.generate()) % 3
return CatchAction(rawValue: id)!
}
}
enum CatchAction : Int {
case none
case left
case right
}
struct Position : Equatable, Hashable {
var x: Int
var y: Int
}
struct CatchEnvironment : Environment {
typealias Action = CatchAction
let rowCount: Int
let columnCount: Int
var ballPosition: Position
var paddlePosition: Position
}
extension CatchEnvironment {
init(rowCount: Int, columnCount: Int, seed: UInt32? = nil) {
self.rowCount = rowCount
self.columnCount = columnCount
self.ballPosition = Position(x: 0, y: 0)
self.paddlePosition = Position(x: 0, y: 0)
reset()
}
mutating func step(
with action: CatchAction
) -> (observation: Observation, reward: Float) {
// Update state.
switch action {
case .left where paddlePosition.x > 0:
paddlePosition.x -= 1
case .right where paddlePosition.x < columnCount - 1:
paddlePosition.x += 1
default:
break
}
ballPosition.y += 1
// Get reward.
let currentReward = reward()
// Return observation and reward.
if ballPosition.y == rowCount {
return (reset(), currentReward)
}
return (observation(), currentReward)
}
/// Resets the ball to be in a random column in the first row, and resets the
/// paddle to be in the middle column of the bottom row.
@discardableResult
mutating func reset() -> Observation {
let randomColumn = Int(RandomState.global.generate()) % columnCount
ballPosition = Position(x: randomColumn, y: 0)
paddlePosition = Position(x: columnCount / 2, y: rowCount - 1)
return observation()
}
/// If the ball is in the bottom row:
/// - Return 1 if the horizontal distance from the ball to the paddle is less
/// than or equal to 1.
/// - Otherwise, return -1.
/// If the ball is not in the bottom row, return 0.
func reward() -> Float {
if ballPosition.y == rowCount {
return abs(ballPosition.x - paddlePosition.x) <= 1 ? 1 : -1
}
return 0
}
/// Returns an observation of the game grid.
func observation() -> Observation {
return ShapedArray<Float>(
shape: [3],
scalars: [Float(ballPosition.x) / Float(columnCount),
Float(ballPosition.y) / Float(rowCount),
Float(paddlePosition.x) / Float(columnCount)]
)
}
/// Returns the game grid as a 2D matrix where all scalars are 0 except the
/// positions of the ball and paddle, which are 1.
var grid: ShapedArray<Float> {
var result = ShapedArray<Float>(shape: [rowCount, columnCount], repeating: 0)
result[ballPosition.y][ballPosition.x] = ShapedArraySlice(1 as Float)
result[paddlePosition.y][paddlePosition.x] = ShapedArraySlice(1 as Float)
return result
}
}
extension CatchEnvironment : CustomStringConvertible {
var description: String {
return grid.description(joinedBy: "\n")
}
}
func main() {
// Set global seed.
RandomState.global.seed(with: 42)
// Setup environment and agent.
var environment = CatchEnvironment(rowCount: 5, columnCount: 5)
var action: CatchAction = .none
var agent = CatchAgent(learningRate: 0.01)
var gameCount = 0
var winCount = 0
var totalWinCount = 0
let maxIterations = 1000
repeat {
// NOTE: the next line is the only one running tensor code.
let state = environment.step(with: action)
action = agent.step(with: state)
if !state.reward.isZero {
print("Game \(gameCount)", state.reward)
gameCount += 1
if state.reward > 0 {
winCount += 1
totalWinCount += 1
}
if gameCount % 20 == 0 {
print("Win rate (last 20 games): \(Float(winCount) / 20)")
print("""
Win rate (total): \(Float(totalWinCount) / Float(gameCount)) \
[\(totalWinCount)/\(gameCount)]
""")
winCount = 0
}
}
} while gameCount < maxIterations
print("""
Win rate (final): \(Float(totalWinCount) / Float(gameCount)) \
[\(totalWinCount)/\(gameCount)]
""")
}
main()
Running this with swift -O catch.swift gives an internal error: the _$S5catch8PositionVMn symbol is not resolved. I haven't tested compiler mode; this may be a JIT-only issue.
<unknown>:0: error: fatal error encountered during compilation; please file a bug report with your project and the crash log
<unknown>:0: note: Program used external function '_$S5catch8PositionVMn' which could not be resolved!
0 swift 0x000000010b1e9d78 llvm::sys::PrintStackTrace(llvm::raw_ostream&) + 40
1 swift 0x000000010b1ea486 SignalHandler(int) + 694
2 libsystem_platform.dylib 0x00007fff77a10f5a _sigtramp + 26
3 swift 0x000000010bf1eb40 (anonymous namespace)::DarwinX86AsmBackend::getCompactUnwindRegNum(unsigned int) const::CU64BitRegs + 229990
4 libsystem_c.dylib 0x00007fff777ae1ae abort + 127
5 swift 0x00000001079f6a67 swift::performFrontend(llvm::ArrayRef<char const*>, char const*, void*, swift::FrontendObserver*)::$_0::__invoke(void*, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, bool) + 519
6 swift 0x000000010b18574c llvm::report_fatal_error(llvm::Twine const&, bool) + 252
7 swift 0x0000000108556f2c llvm::RuntimeDyldImpl::resolveExternalSymbols() + 2956
8 swift 0x0000000108555a09 llvm::RuntimeDyldImpl::resolveRelocations() + 201
9 swift 0x0000000108545111 llvm::MCJIT::finalizeObject() + 433
10 swift 0x0000000107a395c9 swift::RunImmediately(swift::CompilerInstance&, std::__1::vector<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >, std::__1::allocator<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > > > const&, swift::IRGenOptions&, swift::SILOptions const&) + 3017
11 swift 0x00000001079f5f94 performCompile(swift::CompilerInstance&, swift::CompilerInvocation&, llvm::ArrayRef<char const*>, int&, swift::FrontendObserver*, swift::UnifiedStatsReporter*) + 15188
12 swift 0x00000001079f142e swift::performFrontend(llvm::ArrayRef<char const*>, char const*, void*, swift::FrontendObserver*) + 3310
13 swift 0x00000001079a7f83 main + 2051
14 libdyld.dylib 0x00007fff77702015 start + 1
15 libdyld.dylib 0x000000000000000e start + 2291130362
Stack dump:
The emitted IR has this symbol, however.
@"$S5catch8PositionVMn" = hidden constant <{ i32, i32, i32, i32, i32, i32 }> <{ i32 262225, i32 trunc (i64 sub (i64 ptrtoint (<{ i32, i32, i32 }>* @"$S5catchMXM" to i64), i64 ptrtoint (i32* getelementptr
inbounds (<{ i32, i32, i32, i32, i32, i32 }>, <{ i32, i32, i32, i32, i32, i32 }>* @"$S5catch8PositionVMn", i32 0, i32 1) to i64)) to i32), i32 trunc (i64 sub (i64 ptrtoint ([9 x i8]* @19 to i64), i64 pt
rtoint (i32* getelementptr inbounds (<{ i32, i32, i32, i32, i32, i32 }>, <{ i32, i32, i32, i32, i32, i32 }>* @"$S5catch8PositionVMn", i32 0, i32 2) to i64)) to i32), i32 trunc (i64 sub (i64 ptrtoint (%sw
ift.type* ()* @"$S5catch8PositionVMa" to i64), i64 ptrtoint (i32* getelementptr inbounds (<{ i32, i32, i32, i32, i32, i32 }>, <{ i32, i32, i32, i32, i32, i32 }>* @"$S5catch8PositionVMn", i32 0, i32 3) to
i64)) to i32), i32 2, i32 2 }>, section "__TEXT,__const", align 4
I got an assert failure when trying to build the following snippet, full backtrace below. This seems to work fine when writing functions that return a single tensor rather than a pair of tensors.
import TensorFlow
@_inlineable @inline(__always)
public func testOp<TFloat : TensorProtocol, TInt64 : TensorProtocol>(inTensor: TInt64) -> (TInt64, TFloat)
{
return #tfop("TestOp", inTensor)
}
F0501 09:17:49.304163 9029 logging.cc:78] assert.h assertion failed at src/swift/lib/IRGen/Explosion.h:200 in llvm::Type *swift::irgen::ExplosionSchema::Element::getScalarType() const: isScalar()
*** Check failure stack trace: ***
@ 0x558450f8af4f base_logging::LogMessage::SendToLog()
@ 0x558450f8b734 base_logging::LogMessage::Flush()
@ 0x558450f8d9f9 base_logging::LogMessageFatal::~LogMessageFatal()
@ 0x558450f89a76 __assert_fail
@ 0x55844bceb128 swift::irgen::emitBuiltinCall()
@ 0x55844bcbb7bb (anonymous namespace)::IRGenSILFunction::visitSILBasicBlock()
@ 0x55844bcad43a swift::irgen::IRGenModule::emitSILFunction()
@ 0x55844bbae1db swift::irgen::IRGenerator::emitGlobalTopLevel()
@ 0x55844bb4d6d6 performIRGeneration()
@ 0x55844bb4defe swift::performIRGeneration()
@ 0x55844baf96a3 generateIR()
@ 0x55844baed025 performCompile()
@ 0x55844baea14d swift::performFrontend()
@ 0x55844ba9b02e main
@ 0x7f1a94da9bbd __libc_start_main
@ 0x55844ba94029 _start
I'm running the prebuilt binary.
let amount = 3
let count = "I have \(amount) apples."
lldb: /home/danielzheng/swift-build/llvm/lib/IR/Value.cpp:404: void llvm::Value::doRAUW(llvm::Value *, bool): Assertion `New->getType() == getType() && "replaceAllUses of value with new value of different type!"' failed.
Stack dump:
0. Running pass 'Function Pass Manager' on module 'lldb_module'.
1. Running pass 'Swift ARC contraction' on function '@"$SSS19stringInterpolationS2Sd_tcfCTf4gd_n"'
[1] 12544 abort (core dumped) bin/swift
This might be related to the older item #132
When trying to build swift-format I get the following error:
λ ~/Projects/swift-format/ format swift build
warning: target 'CCommonMark' in package 'swift-format' contains no valid source files
warning: invalid duplicate target dependency declaration 'swift-build' in target 'FunctionalPerformanceTests'
warning: invalid duplicate target dependency declaration 'swift-package' in target 'FunctionalPerformanceTests'
/Users/patrick/Projects/swift-format/Sources/CommonMark/CMarkInterop.swift:13:8: error: no such module 'CCommonMark'
import CCommonMark
^
/Users/patrick/Projects/swift-format/Sources/CommonMark/CMarkInterop.swift:13:8: error: no such module 'CCommonMark'
import CCommonMark
^
/Users/patrick/Projects/swift-format/Sources/CommonMark/CMarkInterop.swift:13:8: error: no such module 'CCommonMark'
import CCommonMark
^
/Users/patrick/Projects/swift-format/Sources/CommonMark/CMarkInterop.swift:13:8: error: no such module 'CCommonMark'
import CCommonMark
^
Currently one can do
let (a, b): (TensorHandle<Scalar>, TensorHandle<Scalar>) = #tfop("some_multi_return_op", ...)
But not
let (a, b): (Tensor<Scalar>, Tensor<Scalar>) = #tfop("some_multi_return_op", ...)
We should support unwrapping when #tfop returns a tuple. This is tricky, because there's no simple DAG we can turn the AST into. We'd probably want to rewrite it to two statements, or a closure application.
let (aHandle, bHandle): (TensorHandle<Scalar>, TensorHandle<Scalar>) = #tfop("some_multi_return_op", ...)
let (a, b) = (Tensor(handle: aHandle), Tensor(handle: bHandle))
Currently the compiler runtime functions (_swift_tfc_XXX) are declared public, and the only reason they are public is the following synchronous runtime test:
https://github.com/google/swift/blob/tensorflow/test/TensorFlowRuntime/sync_runtime.swift
This is the only remaining test that uses hard-coded protos. Hard-coded protos may break at any time in the future, whenever the proto format changes in TF. Moreover, this test is no longer reproducible because the entry name is "the_function", which is the name we used in the very early days.
We should consider removing this test and making compiler entry points @_versioned instead of public.
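A minimal sketch of the proposed visibility change (the entry-point name below is hypothetical, following the _swift_tfc_ prefix mentioned above):

```swift
// Before: part of the public API surface, only so the sync_runtime test can call it.
// public func _swift_tfc_StartTensorComputation() { ... }

// After: visible to @_inlineable code in clients via @_versioned,
// but no longer part of the public API.
@_versioned
internal func _swift_tfc_StartTensorComputation() {
  // runtime entry-point body
}
```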
Hi, I've tried to build the REPL from sources on Ubuntu 16.04 and Mac OS. However, the built REPL throws the following error when creating a simple tensor:
(swift) import TensorFlow
(swift) var a = Tensor([1, 1])
2018-04-30 10:19:38.802384: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.1 SSE4.2 AVX
// a : Tensor<Double> = [1.0, 1.0]
(swift) var b = Tensor([[1, 1], [1, 1]])
!!! Compiler bug -- Tensor op builtin __tfop_Pack,$in cannot be lowered to LLVM IR !!!
/mnt/swift-source/build/Ninja-RelWithDebInfoAssert/swift-linux-x86_64/bin/swift[0x3f1ed4f]
/mnt/swift-source/build/Ninja-RelWithDebInfoAssert/swift-linux-x86_64/bin/swift[0x3f1f056]
/lib/x86_64-linux-gnu/libpthread.so.0(+0x11390)[0x7f603cfa2390]
/lib/x86_64-linux-gnu/libc.so.6(gsignal+0x38)[0x7f603717c428]
/lib/x86_64-linux-gnu/libc.so.6(abort+0x16a)[0x7f603717e02a]
[0x7f603d29c1e8]
/mnt/swift-source/build/Ninja-RelWithDebInfoAssert/swift-linux-x86_64/bin/swift[0x1034a1e]
/mnt/swift-source/build/Ninja-RelWithDebInfoAssert/swift-linux-x86_64/bin/swift[0x10389d2]
/mnt/swift-source/build/Ninja-RelWithDebInfoAssert/swift-linux-x86_64/bin/swift[0x4ff6d4]
/mnt/swift-source/build/Ninja-RelWithDebInfoAssert/swift-linux-x86_64/bin/swift[0x4fb035]
/mnt/swift-source/build/Ninja-RelWithDebInfoAssert/swift-linux-x86_64/bin/swift[0x4de8b2]
/mnt/swift-source/build/Ninja-RelWithDebInfoAssert/swift-linux-x86_64/bin/swift[0x4dd09c]
/mnt/swift-source/build/Ninja-RelWithDebInfoAssert/swift-linux-x86_64/bin/swift[0x48d7bd]
/lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf0)[0x7f6037167830]
/mnt/swift-source/build/Ninja-RelWithDebInfoAssert/swift-linux-x86_64/bin/swift[0x48b019]
Stack dump:
0. Program arguments: /mnt/swift-source/build/Ninja-RelWithDebInfoAssert/swift-linux-x86_64/bin/swift -frontend -repl -disable-objc-interop -color-diagnostics -module-name REPL
1. while processing REPL source:
import TensorFlow
var a = Tensor([1, 1])
var b = Tensor([[1, 1], [1, 1]])
Aborted (core dumped)
To replicate, build with build-script --enable-tensorflow
, then start the compiler-integrated REPL with:
build/buildbot_osx/swift-macosx-x86_64/bin/swift
build/buildbot_linux/swift-linux-x86_64/bin/swift
Then, run:
(swift) import TensorFlow
(swift) var a = Tensor([1, 1])
Here's a long-standing issue that we haven't yet visited. The following code results in unexpected behavior.
import Foundation
import TensorFlow
var x = Tensor([[1, 2], [3, 4]])
print("start time", Date())
for _ in 0...100000 {
x += x ⊗ x
}
print("finished a big loop at", Date())
for _ in 0...9 {
x += x ⊗ x
}
print("end time", Date())
When you run this program, the print statement in the middle does not block async execution, so "finished a big loop" gets printed immediately after "start time". This is because execution is asynchronous and the print has no data dependency on the tensor computation. This goes against the mental model for writing such imperative code.
$ swift -O test.swift
start time 2018-05-08 01:26:30 +0000
finished a big loop at 2018-05-08 01:26:30 +0000
end time 2018-05-08 01:26:34 +0000
In the programming model, users should not need to be aware that "there is a graph" or that "the graph runs asynchronously". The expected behavior is that a side-effecting statement like print in the middle of a graph blocks async execution (even without send/receive). Blocking for a print like this hurts performance, but it is an expected feature of any imperative code and is important for debugging.
Start.isValid() == End.isValid() && "Start and end should either both be valid or both be invalid!"), function SourceRange, file /Library/Caches/swift-build/swift/include/swift/Basic/SourceLoc.h, line 93
Related code path to look at:
Line 3670 in a988b64
(by @lattner)
We've been using/abusing BuiltinInst for tensor operations, which has worked somewhat well. However, it has some awkward parts to it:
Instead of using them for this, it would make sense to introduce a new SIL instruction specific to this purpose, which can store the attributes as actual constants in a tail-allocated array. This would make things more efficient and direct.
We can introduce this at any time; the instructions can be created by SILTensorOpInfo::checkAndDiagnoseOperands. However, the biggest win would come from doing it when deabstraction is handling all the constants, because that would eliminate all the extra SIL very early in the compiler.
import TensorFlow
var a = Tensor([[1.0, 2.0], [3.0, 4.0]])
let dot = a.dot(a)
func foo() { print(dot) }
foo()
➜ swift -O test.swift
0 swift 0x00000001050b7d78 llvm::sys::PrintStackTrace(llvm::raw_ostream&) + 40
1 swift 0x00000001050b8486 SignalHandler(int) + 694
2 libsystem_platform.dylib 0x00007fff77631f5a _sigtramp + 26
3 libsystem_platform.dylib 000000000000000000 _sigtramp + 2291982528
4 libswiftTensorFlow.dylib 0x0000000111e5cfde $S10TensorFlow0A0V11descriptionSSvg + 606
5 libswiftTensorFlow.dylib 0x0000000111e6aea4 $S10TensorFlow0A0VyxGs23CustomStringConvertibleAAsAEP11descriptionSSvgTW + 20
6 libswiftCore.dylib 0x000000011333230a $Ss15_print_unlockedyyx_q_zts16TextOutputStreamR_r0_lFTf4gn_n + 794
7 libswiftCore.dylib 0x0000000113384d58 $Ss6_print_9separator10terminator2toySayypG_S2Sxzts16TextOutputStreamRzlFs7_StdoutV_Tg5Tf4ggXgXn_n + 200
8 libswiftCore.dylib 0x0000000113385beb $Ss5print_9separator10terminatoryypd_S2StFTf4ggXgX_nTm + 331
9 libswiftCore.dylib 0x0000000113247e80 $Ss5print_9separator10terminatoryypd_S2StFTm + 32
10 libswiftCore.dylib 0x0000000113247d8f $Ss5print_9separator10terminatoryypd_S2StF + 31
11 libswiftCore.dylib 0x000000010fa6e20e $Ss5print_9separator10terminatoryypd_S2StF + 4236403870
12 swift 0x000000010241503d llvm::MCJIT::runFunction(llvm::Function*, llvm::ArrayRefllvm::GenericValue) + 461
13 swift 0x0000000102418b61 llvm::ExecutionEngine::runFunctionAsMain(llvm::Function*, std::__1::vector<std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator >, std::__1::allocator<std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator > > > const&, char const* const*) + 1313
14 swift 0x0000000101907827 swift::RunImmediately(swift::CompilerInstance&, std::__1::vector<std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator >, std::__1::allocator<std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator > > > const&, swift::IRGenOptions&, swift::SILOptions const&) + 3623
15 swift 0x00000001018c3f94 performCompile(swift::CompilerInstance&, swift::CompilerInvocation&, llvm::ArrayRef<char const*>, int&, swift::FrontendObserver*, swift::UnifiedStatsReporter*) + 15188
16 swift 0x00000001018bf42e swift::performFrontend(llvm::ArrayRef<char const*>, char const*, void*, swift::FrontendObserver*) + 3310
17 swift 0x0000000101875f83 main + 2051
18 libdyld.dylib 0x00007fff773b0115 start + 1
19 libdyld.dylib 0x000000000000000b start + 2294611703
Stack dump:
0. Program arguments: /Library/Developer/Toolchains/swift-tensorflow-DEVELOPMENT-2018-04-26-a.xctoolchain/usr/bin/swift -frontend -interpret test.swift -enable-objc-interop -sdk /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.13.sdk -O -color-diagnostics -module-name test
[1] 18529 segmentation fault swift -O test.swift
I found some tests here: https://github.com/google/swift/blob/tensorflow/test/TensorFlow/tensor_autodiff.swift
But neither Xcode nor the CLI tool is able to run them.
Error message during compilation:
Assertion failed: (isParsing || !getFunction().hasQualifiedOwnership()), function createRetainValue, file /Users/danielzheng/swift-new/swift/include/swift/SIL/SILBuilder.h, line 984.
0 swift 0x0000000112bf4d78 llvm::sys::PrintStackTrace(llvm::raw_ostream&) + 40
1 swift 0x0000000112bf5486 SignalHandler(int) + 694
2 libsystem_platform.dylib 0x00007fff52e6bf5a _sigtramp + 26
3 libsystem_platform.dylib 0x0000000000000000 _sigtramp + 2904113344
4 libsystem_c.dylib 0x00007fff52c96312 abort + 127
5 libsystem_c.dylib 0x00007fff52c5e368 basename_r + 0
6 swift 0x00000001100fb764 swift::SILBuilder::createRetainValue(swift::SILLocation, swift::SILValue, swift::RefCountingInst::Atomicity) + 484
7 swift 0x000000010fd1e576 swift::SILCloner<swift::SILInliner>::visitRetainValueInst(swift::RetainValueInst*) + 390
8 swift 0x000000010fd03fb3 swift::SILCloner<swift::SILInliner>::visitSILBasicBlock(swift::SILBasicBlock*) + 83
9 swift 0x000000010fd036d8 swift::SILInliner::inlineFunction(swift::FullApplySite, llvm::ArrayRef<swift::SILValue>) + 1656
10 swift 0x000000010fdba4b9 runOnFunctionRecursively(swift::SILFunction*, swift::FullApplySite, swift::SILOptions::LinkingMode, llvm::DenseSet<swift::SILFunction*, llvm::DenseMapInfo<swift::SILFunction*> >&, llvm::ImmutableSet<swift::SILFunction*, llvm::ImutContainerInfo<swift::SILFunction*> >::Factory&, llvm::ImmutableSet<swift::SILFunction*, llvm::ImutContainerInfo<swift::SILFunction*> >, swift::ClassHierarchyAnalysis*, swift::SILInliner::InlineKind, std::__1::function<bool (swift::FullApplySite, swift::SILFunction const&)> const&) + 5545
11 swift 0x000000010fdb9f64 runOnFunctionRecursively(swift::SILFunction*, swift::FullApplySite, swift::SILOptions::LinkingMode, llvm::DenseSet<swift::SILFunction*, llvm::DenseMapInfo<swift::SILFunction*> >&, llvm::ImmutableSet<swift::SILFunction*, llvm::ImutContainerInfo<swift::SILFunction*> >::Factory&, llvm::ImmutableSet<swift::SILFunction*, llvm::ImutContainerInfo<swift::SILFunction*> >, swift::ClassHierarchyAnalysis*, swift::SILInliner::InlineKind, std::__1::function<bool (swift::FullApplySite, swift::SILFunction const&)> const&) + 4180
12 swift 0x000000010fdb9f64 runOnFunctionRecursively(swift::SILFunction*, swift::FullApplySite, swift::SILOptions::LinkingMode, llvm::DenseSet<swift::SILFunction*, llvm::DenseMapInfo<swift::SILFunction*> >&, llvm::ImmutableSet<swift::SILFunction*, llvm::ImutContainerInfo<swift::SILFunction*> >::Factory&, llvm::ImmutableSet<swift::SILFunction*, llvm::ImutContainerInfo<swift::SILFunction*> >, swift::ClassHierarchyAnalysis*, swift::SILInliner::InlineKind, std::__1::function<bool (swift::FullApplySite, swift::SILFunction const&)> const&) + 4180
13 swift 0x000000010fdb8e9d swift::inlineForTFDeabstraction(swift::SILFunction&, std::__1::function<bool (swift::FullApplySite, swift::SILFunction const&)> const&) + 269
14 swift 0x000000010fdcd634 (anonymous namespace)::TFDeabstraction::doIt() + 404
15 swift 0x000000010fdcd352 (anonymous namespace)::TFDeabstractionPass::run() + 658
16 swift 0x000000010fd31e7a swift::SILPassManager::runModulePass(unsigned int) + 346
17 swift 0x000000010fd327d7 swift::SILPassManager::execute() + 775
18 swift 0x000000010f56f63b swift::SILPassManager::executePassPipelinePlan(swift::SILPassPipelinePlan const&) + 187
19 swift 0x000000010fd3b51c swift::runSILDiagnosticPasses(swift::SILModule&) + 172
20 swift 0x000000010f40063f performCompile(swift::CompilerInstance&, swift::CompilerInvocation&, llvm::ArrayRef<char const*>, int&, swift::FrontendObserver*, swift::UnifiedStatsReporter*) + 12799
21 swift 0x000000010f3fc42e swift::performFrontend(llvm::ArrayRef<char const*>, char const*, void*, swift::FrontendObserver*) + 3310
22 swift 0x000000010f3b2f83 main + 2051
23 libdyld.dylib 0x00007fff52bea115 start + 1
Stack dump:
0. Program arguments: /Library/Developer/Toolchains/swift-tensorflow-DEVELOPMENT-2018-04-26-a.xctoolchain/usr/bin/swift -frontend -c -primary-file /Users/wangtz/Projects/swift/Example/TensotFlowSwiftTest/TensotFlowTest/main.swift -target x86_64-apple-macosx10.13 -enable-objc-interop -sdk /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.13.sdk -I /Users/wangtz/Library/Developer/Xcode/DerivedData/TensotFlowTest-fuvaitqezlugrqgwukmtlrcfypbx/Build/Products/Debug -F /Users/wangtz/Library/Developer/Xcode/DerivedData/TensotFlowTest-fuvaitqezlugrqgwukmtlrcfypbx/Build/Products/Debug -enable-testing -gnone -module-cache-path /Users/wangtz/Library/Developer/Xcode/DerivedData/ModuleCache.noindex -swift-version 4 -O -D DEBUG -serialize-debugging-options -Xcc -I/Users/wangtz/Library/Developer/Xcode/DerivedData/TensotFlowTest-fuvaitqezlugrqgwukmtlrcfypbx/Build/Intermediates.noindex/TensotFlowTest.build/Debug/TensotFlowTest.build/swift-overrides.hmap -Xcc -iquote -Xcc /Users/wangtz/Library/Developer/Xcode/DerivedData/TensotFlowTest-fuvaitqezlugrqgwukmtlrcfypbx/Build/Intermediates.noindex/TensotFlowTest.build/Debug/TensotFlowTest.build/TensotFlowTest-generated-files.hmap -Xcc -I/Users/wangtz/Library/Developer/Xcode/DerivedData/TensotFlowTest-fuvaitqezlugrqgwukmtlrcfypbx/Build/Intermediates.noindex/TensotFlowTest.build/Debug/TensotFlowTest.build/TensotFlowTest-own-target-headers.hmap -Xcc -I/Users/wangtz/Library/Developer/Xcode/DerivedData/TensotFlowTest-fuvaitqezlugrqgwukmtlrcfypbx/Build/Intermediates.noindex/TensotFlowTest.build/Debug/TensotFlowTest.build/TensotFlowTest-all-target-headers.hmap -Xcc -iquote -Xcc /Users/wangtz/Library/Developer/Xcode/DerivedData/TensotFlowTest-fuvaitqezlugrqgwukmtlrcfypbx/Build/Intermediates.noindex/TensotFlowTest.build/Debug/TensotFlowTest.build/TensotFlowTest-project-headers.hmap -Xcc 
-I/Users/wangtz/Library/Developer/Xcode/DerivedData/TensotFlowTest-fuvaitqezlugrqgwukmtlrcfypbx/Build/Products/Debug/include -Xcc -I/Users/wangtz/Library/Developer/Xcode/DerivedData/TensotFlowTest-fuvaitqezlugrqgwukmtlrcfypbx/Build/Intermediates.noindex/TensotFlowTest.build/Debug/TensotFlowTest.build/DerivedSources/x86_64 -Xcc -I/Users/wangtz/Library/Developer/Xcode/DerivedData/TensotFlowTest-fuvaitqezlugrqgwukmtlrcfypbx/Build/Intermediates.noindex/TensotFlowTest.build/Debug/TensotFlowTest.build/DerivedSources -Xcc -DDEBUG=1 -Xcc -working-directory/Users/wangtz/Projects/swift/Example/TensotFlowSwiftTest -emit-module-doc-path /Users/wangtz/Library/Developer/Xcode/DerivedData/TensotFlowTest-fuvaitqezlugrqgwukmtlrcfypbx/Build/Intermediates.noindex/TensotFlowTest.build/Debug/TensotFlowTest.build/Objects-normal/x86_64/mainpartial.swiftdoc -serialize-diagnostics-path /Users/wangtz/Library/Developer/Xcode/DerivedData/TensotFlowTest-fuvaitqezlugrqgwukmtlrcfypbx/Build/Intermediates.noindex/TensotFlowTest.build/Debug/TensotFlowTest.build/Objects-normal/x86_64/main.dia -module-name TensotFlowTest -emit-module-path /Users/wangtz/Library/Developer/Xcode/DerivedData/TensotFlowTest-fuvaitqezlugrqgwukmtlrcfypbx/Build/Intermediates.noindex/TensotFlowTest.build/Debug/TensotFlowTest.build/Objects-normal/x86_64/mainpartial.swiftmodule -emit-dependencies-path /Users/wangtz/Library/Developer/Xcode/DerivedData/TensotFlowTest-fuvaitqezlugrqgwukmtlrcfypbx/Build/Intermediates.noindex/TensotFlowTest.build/Debug/TensotFlowTest.build/Objects-normal/x86_64/main.d -emit-reference-dependencies-path /Users/wangtz/Library/Developer/Xcode/DerivedData/TensotFlowTest-fuvaitqezlugrqgwukmtlrcfypbx/Build/Intermediates.noindex/TensotFlowTest.build/Debug/TensotFlowTest.build/Objects-normal/x86_64/main.swiftdeps -o 
/Users/wangtz/Library/Developer/Xcode/DerivedData/TensotFlowTest-fuvaitqezlugrqgwukmtlrcfypbx/Build/Intermediates.noindex/TensotFlowTest.build/Debug/TensotFlowTest.build/Objects-normal/x86_64/main.o
Users should not need to import Foundation to use scalar math if they already imported TensorFlow.
➜ ~ swift
Welcome to Swift version 4.2-dev (LLVM 04bdb56f3d, Clang b44dbbdf44). Type :help for assistance.
1> import TensorFlow
2> var w = Tensor(randomNormal: [1, 2])
Assertion failed: (New->getType() == getType() && "replaceAllUses of value with new value of different type!"), function doRAUW, file /Users/rxwei/Development/swift-copybara-clean/llvm/lib/IR/Value.cpp, line 404.
[1] 53570 abort swift
I did not manage to (easily) build swift-format. It would be very helpful to add build instructions to the README.
It is totally reasonable to use a varargs array of stuff to configure a TF-hoistable operation. Unfortunately, right now the array construction gets expanded by the performance inliner, and partitioning can't hoist it. Something like this should be able to work:
import TensorFlow
func hoistableArrayArg<T : Numeric>(extra data: (T, T)...) -> Tensor<T> {
  return Tensor<T>(handle: T._hoistableClosure {
    // Silly example
    let sum = data.reduce(0, { $0 + $1.0 })
    return Tensor<T>(sum).handle
  })
}
public func testHoistableArrayArg() {
  var matrix: Tensor<Float> = [[1, 2, 3], [4, 5, 6]] + 1
  matrix += hoistableArrayArg(extra: (1, 2), (3, 4))
  _ = (matrix * matrix).array
}
There are cases that nearly work: if you pass an array of integers (not an array of tuples), then the perf inliner promotes it to a static initializer, which is hoistable, and it works.
In the immediate future we can use an autoclosure; we just lose out on varargs syntax and have to use explicit []'s on the caller side:
func hoistableArrayArg(extra dataFn: @autoclosure () -> [Float]) -> Tensor<Float> {
  return Tensor(handle: Float._hoistableClosure {
    // Silly example
    let sum = dataFn().reduce(0, +)
    return Tensor<Float>(sum).handle
  })
}
public func testHoistableArrayArg() {
  var matrix: Tensor<Float> = [[1, 2, 3], [4, 5, 6]]
  matrix += hoistableArrayArg(extra: [1, 2, 3])
  _ = matrix.array
}
Probably my inexperience, but I can't get the linking to work when I use "import Python" in Xcode -- it throws up a lot of undefined symbols. Is there a particular flag required?
From mailing list discussion: https://groups.google.com/a/tensorflow.org/forum/#!topic/swift/0de3S72qMTU
It would be nice to add initializers that bridge Python numpy.ndarray (as PyValue) to Swift Array and Tensor. The implementation should use the raw NumPy data pointer via arr.__array_interface__["data"] for efficiency:
>>> import numpy as np
>>> arr = np.zeros(5)
>>> arr.__array_interface__
{'data': (6245613920, False), 'strides': None, 'descr': [('', '<f8')], 'typestr': '<f8', 'shape': (5,), 'version': 3}
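To make the "use the raw data pointer" idea concrete, here is a small Python sketch of reading the buffer back through the address that __array_interface__["data"] exposes — this is the same no-copy access a bridging initializer would perform. It uses only ctypes and numpy; the variable names are illustrative.

```python
import ctypes
import numpy as np

arr = np.arange(5, dtype=np.float64)

# The interface exposes the raw buffer address and a read-only flag.
addr, read_only = arr.__array_interface__["data"]

# View the same memory as a ctypes array of doubles -- no copy is made.
buf = (ctypes.c_double * arr.size).from_address(addr)
print(list(buf))  # [0.0, 1.0, 2.0, 3.0, 4.0]
```

A Swift initializer would do the analogous thing: interpret the address as an UnsafePointer of the matching scalar type and copy `shape`-many elements out.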
Here is a sample skeleton for a Tensor bridge:
extension Tensor {
  // Failable initializer.
  public init?(_ numpyArray: PyValue) {
    // 1. Check if input is a NumPy array.
    // 2. Check if the array's dtype matches the `Scalar` type
    //    (e.g. `np.float32` and `Float`).
    // 3. Extract info from `__array_interface__` and convert to Tensor.
    self.init(shape: ..., scalars: ...)
  }
}
Something to consider is how to handle bridging higher-dimensional numpy.ndarray instances to Array. It makes sense for the bridging initializer to fail when the numpy.ndarray instance is not 1-D.
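The checks the failable initializer would perform can be modeled in Python; bridge_to_swift_array below is a hypothetical stand-in for the Swift init?, not an actual API — it returns None exactly where the initializer would fail.

```python
import numpy as np

# Hypothetical model of the proposed failable initializer: bridging
# fails (returns None) unless `obj` is a 1-D NumPy array whose dtype
# matches the expected scalar type.
def bridge_to_swift_array(obj, expected_dtype):
    if not isinstance(obj, np.ndarray):
        return None                      # 1. not a NumPy array
    if obj.dtype != np.dtype(expected_dtype):
        return None                      # 2. dtype mismatch (np.float32 vs Float, etc.)
    if obj.ndim != 1:
        return None                      # higher-dimensional: fail
    return obj.tolist()

ok = bridge_to_swift_array(np.zeros(3, dtype=np.float32), np.float32)
bad = bridge_to_swift_array(np.zeros((2, 2)), np.float64)
print(ok, bad)  # [0.0, 0.0, 0.0] None
```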
4e27ff2 added code for annotating SIL code with TFPartition markings.
One follow-up suggestion from clattner is to use a SIL printer callback before printing each instruction (SILPrinter.cpp:634):
for (const SILInstruction &I : *BB) {
  Ctx.printInstructionCallBack(&I);
  if (SILPrintGenericSpecializationInfo) {
    if (auto AI = ApplySite::isa(const_cast<SILInstruction *>(&I)))
      if (AI.getSpecializationInfo() && AI.getCalleeFunction())
        printGenericSpecializationInfo(
            PrintState.OS, "call-site", AI.getCalleeFunction()->getName(),
            AI.getSpecializationInfo(), AI.getSubstitutions());
  }
  print(&I);
}
So we can derive a struct from SILPrintContext and implement [move]-style annotations by overriding Ctx.printInstructionCallBack. This way we hopefully just need to call "print" on the function with the right printer context defined.
Example code:
/// A print context which records the line numbers where instructions are
/// printed.
struct AnalysisPrintContext : public SILPrintContext {
  AnalysisPrintContext(llvm::raw_ostream &OS) : SILPrintContext(OS) {}
  ~AnalysisPrintContext() override {}

  void printInstructionCallBack(const SILInstruction *I) override {
    OutStream << "XXXX ";
    I->print(OutStream);
  }
};
Call site:
if (auto *outs = getTFDumpIntermediateStream()) {
  AnalysisPrintContext ctx(*outs);
  fn.print(ctx);
}
One challenge is that in addition to annotating instructions, we also want to annotate BB args (code here). So the current instruction-level callback mechanism is not sufficient. One option is to extend the SILPrinter callback infrastructure to support annotating BB args.
When Swift source file paths contain perfectly normal emojis (or spaces), TensorFlow is not happy.
2018-03-26 17:42:01.497195: E tensorflow/core/common_runtime/executor.cc:644] Executor failed to create kernel. Invalid argument: Node 'op.Swift for TensorFlow.playground.16.15': Node name contains invalid characters
[[Node: tfc_func_main.tf_partition = main.tf_partition[_device="/job:localhost/replica:0/task:0/device:CPU:0"]()]]
Fatal error: Node 'op.Swift for TensorFlow.playground.16.15': Node name contains invalid characters
[[Node: tfc_func_main.tf_partition = main.tf_partition[_device="/job:localhost/replica:0/task:0/device:CPU:0"]()]]: file /Users/rxwei/Development/swift-source/swift/stdlib/public/TensorFlow/CompilerRuntime.swift, line 592
This should be fixed in TFLowerGraph.cpp - the place where we create node names should call into an "escape the name" function to handle this.
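As a sketch of what such an "escape the name" function could look like, here is a Python illustration. escape_node_name and its hex-escape scheme are assumptions for discussion, not TensorFlow's actual behavior; node names are roughly restricted to the pattern [A-Za-z0-9.][A-Za-z0-9_./]* (see the NodeDef comment in node_def.proto).

```python
import re

# Hypothetical escaping scheme: keep characters that are valid in a
# TensorFlow node name, and replace everything else (spaces, emoji, ...)
# with '_' followed by the character's hex code point.
def escape_node_name(name):
    def esc(ch):
        return ch if re.match(r"[A-Za-z0-9_./]", ch) else "_%04x" % ord(ch)
    escaped = "".join(esc(c) for c in name)
    # The first character is more restricted; prefix if it is invalid there.
    if not escaped or not re.match(r"[A-Za-z0-9.]", escaped[0]):
        escaped = "n" + escaped
    return escaped

print(escape_node_name("op.Swift for TensorFlow.playground.16.15"))
# op.Swift_0020for_0020TensorFlow.playground.16.15
```

The important property is that the mapping is deterministic and collision-resistant enough for graph node names; the exact scheme TFLowerGraph.cpp adopts is an implementation choice.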
I tried formatting the project I'm working on with swift-format (using the default configuration with 4-space indentation) and noticed that the following code:
return Presentation(editProduct, style: .modalOrFlip)
    .onValue {
        $0.map { self.libraryStore.edit(.add(.product($0))) }
    }
    .onPresent {
        isEditingLibrary.value = true
    }
is formatted as
return Presentation(editProduct, style: .modalOrFlip)
    .onValue
    {
        $0.map { self.libraryStore.edit(.add(.product($0))) }
    }
    .onPresent
    {
        isEditingLibrary.value = true
    }
As I understand it, this is done according to rule 5 in the Line Wrapping section of the Swift Style Guide. In my opinion, however, the newlines before opening braces are unnecessary in this case and should not be added.
Hi, I'm enjoying swift-format. This is great!
I have a silly question: is it possible to configure line break-ness in this example:
convenience public init(_ name: String) {
    self
        .init(where: { x in
            x.text == name
        })
}
I'd like to put self and .init( on the same line, and wonder if it is possible to do that with the existing configuration.
I like swift-format's other line breaking suggestions, so this is the only case I want to update.
Thanks!