heestand-xyz / pixelkit
Live Graphics in Swift & Metal
License: MIT License
Hi all,
We are trying to use the CameraPIX together with a MOV that has an alpha channel.
```swift
let camera = CameraPIX()
camera.camera = .front

let overlay = VideoPIX()
overlay.load(fileNamed: "FreeSmokeYoutube", withExtension: "mov") { done in
    let blend = BlendPIX()
    blend.blendMode = .addWithAlpha
    blend.placement = .aspectFill
    blend.extend = .mirror
    blend.inputA = camera
    blend.inputB = overlay

    self.record = RecordPIX()
    self.record?.input = blend
    try? self.record?.startRec(name: "tempvid")
}
```
The outcome, though, has a really low frame rate. This is basically the default configuration, so what could the cause be?
Thanks
.gitmodules
contains:
https://github.com/heestand-xyz/MetlShaders.git
but it looks like it should be:
https://github.com/heestand-xyz/MetalShaders.git
('Metal' instead of 'Metl')
Hi Hexagons,
Trying to install the package and getting the following error. Can you please advise?

```
because PixelKit >=1.0.13 contains incompatible tools version and root depends on PixelKit 1.1.2..<1.2.0, version solving failed.
```
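Not an official fix, just a sketch of the two usual ways out of this resolver error; the version numbers come from the error text above, while the tools version and package layout below are assumptions:

```swift
// swift-tools-version:5.3
// Hypothetical Package.swift sketch. The resolver says every PixelKit
// release >= 1.0.13 declares a newer swift-tools-version than this
// manifest supports, while the app asks for 1.1.2..<1.2.0. Two ways out:
//  1. Raise the swift-tools-version line above (and update Xcode)
//     so that the 1.1.x releases can be resolved, or
//  2. Pin below 1.0.13 (losing 1.1.x features):
import PackageDescription

let package = Package(
    name: "MyApp", // hypothetical app target
    dependencies: [
        .package(url: "https://github.com/heestand-xyz/PixelKit", "1.0.0"..<"1.0.13")
    ],
    targets: [
        .target(name: "MyApp", dependencies: ["PixelKit"])
    ]
)
```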
I'd love to use PixelKit, but I'm finding that CocoaPods is "Unable to find a pod with name, author, summary, or description matching PixelKit".
How is it possible to export the Green Screen result to an asset or video? I'm using the example here:
Green Screen in Swift & PixelKit
http://blog.hexagons.se/blog/green-screen-in-swift-pixelkit/
I'm using the following code to create an initial alpha layer called threshold
and then multiplying it with border color. The code below is only run once.
```swift
let fullscreenBackground = ColorPIX(at: .fullscreen)
fullscreenBackground.color = .clear

let blended: BlendPIX = fullscreenBackground & imagePix

let mask: ReorderPIX = ReorderPIX()
mask.inputA = blended
mask.inputB = blended
mask.redChannel = .alpha
mask.greenChannel = .alpha
mask.blueChannel = .alpha
mask.alphaChannel = .one

threshold = blended._blur(0.1)._threshold(0.1)._lumaToAlpha()

let borderColor = LiveColor(color)
let colorEdge = threshold! * borderColor
let final: PIX = colorEdge & imagePix
final.delegate = self
```
Later, when the color changes, I call the following on a 2-second debounce:
```swift
guard let threshold = threshold else { return }
let borderColor = LiveColor(color)
let colorEdge = threshold * borderColor
let final: PIX = colorEdge & imagePix
final.delegate = self
```
M_1_PI_F is used, though that constant is 1.0 / π. M_PI_F needs to be used instead.
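The mix-up is easy to sanity-check outside Metal; in Swift, M_PI_F corresponds to Float.pi and M_1_PI_F to 1 / Float.pi:

```swift
import Foundation

// Metal's M_PI_F is π, while M_1_PI_F is its reciprocal 1/π.
// Using M_1_PI_F where π is meant scales a result by 1/π² ≈ 0.101.
let mPiF: Float = .pi        // what M_PI_F equals
let m1PiF: Float = 1 / .pi   // what M_1_PI_F equals
print(mPiF, m1PiF)           // their product is 1, so they differ by a factor of π²
```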
I'm sorry to raise another issue, but when I export the video, only the video is exported, without the audio channel. Note that if I set this flag to true:

```swift
RecordPIX.recordAudio = true
```

it records the microphone audio along with the video's audio. Is it possible to export the video and audio without enabling the microphone?
Hi,
first off, congrats for the great work done.
I want to apply a FlarePIX to SceneKit via ScenePIX, but ScenePIX is undocumented.
I see

```swift
public func setup(cameraNode: SCNNode, scene: SCNScene, sceneView: SCNView)
```
What is the correct way to apply a ScenePIX to SceneKit?
Thanks.
Hi,
trying to set this up with SceneKit; I think I am missing one or two steps in the init.
Crash on the line below in ScenePIX.swift, line 91:

```swift
renderer.render(atTime: 0, viewport: viewport, commandBuffer: commandBuffer, passDescriptor: renderPassDescriptor)
```

`renderer` seems to be nil: Thread 1: Fatal error: Unexpectedly found nil while implicitly unwrapping an Optional value
Log:

```
RenderKit ready to render.
PixelKit #00 WARNING [1] ScenePIX Render >>> Metal View res not set
PixelKit #1 INFO [1] ScenePIX Res >>> Applied: custom(w: 1500, h: 1000) [1500x1000]
Fatal error: Unexpectedly found nil while implicitly unwrapping an Optional value: file PixelKit/ScenePIX.swift, line 91
```
Code

```swift
DispatchQueue.global(qos: .background).async {
    let scenePix = ScenePIX(at: .custom(w: 1500, h: 1000))
    scenePix.setup(cameraNode: self.sceneView.currentPointOfView()!,
                   scene: self.sceneView.currentScene()!,
                   sceneView: self.sceneView)

    let res: Resolution = .custom(w: 1500, h: 1000)
    let circle = CirclePIX(at: res)
    circle.radius = 0.45
    circle.bgColor = .clear

    let finalPix: PIX = scenePix * circle
    finalPix.view.frame = self.view.bounds
    self.view.addSubview(finalPix.view)
}
```
May I have a hint where to look, please, as there is no readme or examples around ScenePIX?
Thank you very much.
On iOS 16 I get

```
CAMetalLayer ignoring invalid setDrawableSize width=0.000000 height=0.000000
```

or

```
2023-07-10 15:15:22.745864-0400 app[41287:14957907] [Graphics] Invalid size provided to UIGraphicsBeginImageContext(): size={353, 0}, scale=3.000000
2023-07-10 15:15:22.746013-0400 app[41287:14957907] [Unknown process name] CGContextGetType: invalid context 0x0. If you want to see the backtrace, please set CG_CONTEXT_SHOW_BACKTRACE environmental variable.
-----> (1059.0, 0.0) (128.0, 128.0) (353.0, 0.0)
```
code:

```swift
struct Modified<V: View>: View {
    let child: V
    var body: some View {
        let view = ViewPIX { child }
        let pix = view.pixEdge()
        return PixelView(pix: pix)
    }
}
```
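Not part of the report, but a possible workaround sketch: the log shows a zero height being measured, so giving the wrapped view an explicit nonzero frame before handing it to ViewPIX might avoid the invalid drawable size (the 353×353 size is an arbitrary assumption):

```swift
import SwiftUI
import PixelKit

// Hypothetical workaround: force an explicit nonzero size so the
// backing CAMetalLayer never sees a zero-height drawable.
// ViewPIX, pixEdge() and PixelView are the PixelKit types from the
// code above; the fixed size here is an assumption, not the fix.
struct ModifiedSized<V: View>: View {
    let child: V
    var body: some View {
        let view = ViewPIX {
            child.frame(width: 353, height: 353) // explicit nonzero size
        }
        let pix = view.pixEdge()
        return PixelView(pix: pix)
    }
}
```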
A couple of files in this project do not conform to the NodeMetal protocol. Specifically, the metalFileName property introduced here is missing from multiple files, e.g. MetalMergerEffectPIX.
I've added PixelKit to my Podfile and installed it. Xcode does pick it up and I can import it and use the UIKit related part without problems. However, I can't use SwiftUI related types as those don't get resolved by Xcode.
On iOS 16, all UIViewRepresentables have to be structs.
Do you have an example of using video recording?
I was hacking together a demo where I had to do portrait segmentation (Zoom like background removal) without chroma key.
I decided to use DeepLabV3 and found this project: https://github.com/jsharp83/MetalCamera which has all the parts properly set up. With a few minor changes I was able to make it macOS compatible as well.
This would make a great addition to PixelKit.
My use case is: remove the background from realtime video and merge it with a custom image/video.
Relevant File:
https://github.com/jsharp83/MetalCamera/blob/master/MetalCamera/Classes/CoreML/CoreMLClassifierHandler.swift
What I am trying to achieve is the following:
Use the CameraPIX and remove the background (green screen), blend it with a custom background, get an IOSurface-backed CVPixelBuffer from the resulting PIX, and publish it to an external process.
Hi heestand,
Thanks for the fix regarding the Swift version. It still won't build; here are the errors, with links to the source files:
Line 146 in 12d5c2c
And this still tries to compile even with the safeguards (no idea how that's possible; I will thoroughly check my Xcode project).
Sorry, it's very ugly 😅. Do you think it will be possible to remediate these issues to be compatible with an older SDK than Big Sur?
Thanks
Where is the link?
Hi, is it planned to add a function for generating text as a texture using the Metal API? What methods do you know of for generating text with Metal? I would be very happy if you could give an example. My goal is to overlay a fire effect on text. Thanks in advance.
This is a very good framework. I want to save the processed photos to the photo album, but I can't find the corresponding method. I see the code directly loads the view:
```swift
let image = ImagePIX()
image.image = self.myImage

let edge = EdgePIX()
edge.input = image
edge.strength = 3

edge.view.frame = myImageView.bounds
self.view.addSubview(edge.view)
```
I can't get the image information out of edge.view.
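For what it's worth, a sketch of one way to do it, assuming the renderedImage property that PixelKit's README shows for reading a PIX back as a UIImage (exactly when a rendered frame is available is an assumption here):

```swift
import UIKit
import PixelKit

// Hypothetical sketch: read the processed result back as a UIImage
// and write it to the photo library, instead of reading edge.view.
// Requires NSPhotoLibraryAddUsageDescription in Info.plist.
func saveEdgeResult(_ edge: EdgePIX) {
    guard let image: UIImage = edge.renderedImage else {
        return // no frame rendered yet (timing is an assumption)
    }
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
}
```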
Hi,
Sorry, I was out of the Swift/SPM loop for two years, as I was still on Mojave, and a lot has changed; querying Google gives various options, none of which seem to work.
First off, I'm on Catalina using Xcode 12.4 (AFAIK the last version compatible with Catalina).
My project is a basic SwiftUI macOS app.
In Xcode I went to File > Swift Packages > Add Package Dependency... and searched for PixelKit.
In the version selection panel I tried a few things:
Both returned the following:
Very interesting project.
Thanks