HandVector uses the cosine similarity algorithm to calculate the similarity of hand gestures in visionOS, and ships with a macOS tool for testing hand tracking in the visionOS simulator.
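Cosine similarity measures how closely two vectors point in the same direction. A minimal sketch of the idea for 3D vectors (illustrative only — not the library's internal implementation, and the Vec3 type here is a stand-in):

```swift
/// Illustrative 3D vector type (a stand-in, not a HandVector type).
struct Vec3 {
    var x, y, z: Float
    func dot(_ o: Vec3) -> Float { x * o.x + y * o.y + z * o.z }
    var length: Float { dot(self).squareRoot() }
}

/// Cosine similarity: 1 = same direction, 0 = orthogonal, -1 = opposite.
func cosineSimilarity(_ a: Vec3, _ b: Vec3) -> Float {
    let denom = a.length * b.length
    guard denom > 0 else { return 0 }
    return a.dot(b) / denom
}
```

Applying this per joint across a hand skeleton and aggregating the results yields a single gesture-similarity score.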
Requirements • Usage • Installation • Contribution • Contact • License
- visionOS 1.0+
- Xcode 15.2+
- Swift 5.9+
You can run the demo in this package to see how to use it, or try a Vision Pro app that uses it, FingerEmoji, on the App Store.
HandVector allows you to track your hands and calculate the similarity between your current hand pose and a previously recorded hand gesture:
import HandVector

// Load recorded hand gestures from a JSON file
model.handEmojiDict = HandEmojiParameter.generateParametersDict(fileName: "HandEmojiTotalJson")!
guard let okVector = model.handEmojiDict["👌"]?.convertToHandVectorMatcher(),
      let leftOKVector = okVector.left else { return }

// Update the current hand state from HandTrackingProvider
for await update in handTracking.anchorUpdates {
    switch update.event {
    case .added, .updated:
        let anchor = update.anchor
        guard anchor.isTracked else { continue }
        await latestHandTracking.updateHand(from: anchor)
    case .removed:
        ...
    }
}

// Calculate the similarity scores
let leftScore = model.latestHandTracking.leftHandVector?.similarity(to: leftOKVector) ?? 0
model.leftScore = Int(abs(leftScore) * 100)
let rightScore = model.latestHandTracking.rightHandVector?.similarity(to: leftOKVector) ?? 0
model.rightScore = Int(abs(rightScore) * 100)
The score is in [-1.0, 1.0]: 1.0 means fully matched with the same chirality (both hands left, or both right), -1.0 means fully matched but with opposite chirality (one left hand, one right hand), and 0 means no match.
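Because a mirrored match is reported as a negative score, taking abs() (as in the snippet above) treats opposite-handed matches as matches too. A small sketch of turning the raw score into a 0–100 value and gating on it (the threshold of 90 is an arbitrary example, not a value from the library):

```swift
/// Maps a raw similarity in [-1.0, 1.0] to a 0–100 match percentage,
/// counting mirrored (opposite-handed) matches as matches.
func matchPercent(of similarity: Float) -> Int {
    Int(abs(similarity) * 100)
}

/// Example gate; 90 is an arbitrary cutoff chosen for illustration.
func isMatched(_ similarity: Float, threshold: Int = 90) -> Bool {
    matchPercent(of: similarity) >= threshold
}
```

If you want to distinguish left from right hands instead, branch on the sign of the raw score before taking the absolute value.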
The test method of HandVector is inspired by VisionOS Simulator hands; it allows you to test hand tracking in the visionOS simulator:
It uses 2 things:
- A macOS helper app, with a bonjour service
- A Swift class for your visionOS project that connects to the Bonjour service (already included in this package, and it already converts the JSON data into hand gestures)
The helper app uses Google MediaPipe for 3D hand tracking. This is a very basic setup: a WKWebView runs the Google sample code and passes the hand data as JSON into native Swift.
The Swift code then publishes the JSON over a Bonjour service.
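To give a feel for the data flow, here is a sketch of decoding such a JSON frame with Codable. The payload shape below (HandFrame, chirality, joints) is purely hypothetical for illustration — the real format is defined by the helper app and the bundled Swift class:

```swift
import Foundation

// Hypothetical payload shape, for illustration only — the actual
// format is defined by HandVector's macOS helper app.
struct HandFrame: Codable {
    struct Joint: Codable {
        let name: String
        let x: Float, y: Float, z: Float
    }
    let chirality: String
    let joints: [Joint]
}

let sample = """
{"chirality":"left","joints":[{"name":"wrist","x":0.0,"y":0.1,"z":-0.2}]}
"""
// Decode one frame received over the Bonjour connection
let frame = try! JSONDecoder().decode(HandFrame.self, from: Data(sample.utf8))
```

In practice you would decode each frame as it arrives from the network stream and feed the joints into the hand-gesture update path.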
If hand tracking fails to start for a long time (the Start button remains disabled), check your network connection to Google MediaPipe.
To go further, take a look at the documentation and the demo project.
Note: All contributions are welcome
To integrate using Apple's Swift Package Manager, without Xcode integration, add the following as a dependency to your Package.swift:
.package(url: "https://github.com/XanderXu/HandVector.git", .upToNextMajor(from: "0.2.0"))
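Then add "HandVector" to the dependencies of your target. A minimal Package.swift sketch (the package and target name "MyVisionApp" is a placeholder):

```swift
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "MyVisionApp", // placeholder name
    dependencies: [
        .package(url: "https://github.com/XanderXu/HandVector.git", .upToNextMajor(from: "0.2.0"))
    ],
    targets: [
        .target(
            name: "MyVisionApp", // placeholder name
            dependencies: ["HandVector"]
        )
    ]
)
```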
Alternatively, download the project and copy the HandVector folder into your own project to use it.
Contributions are welcome and encouraged ♡.
Xander: API porter
HandVector is released under an MIT license. See LICENSE for more information.