Comments (10)
Can you be more specific? Are you asking how to add UI to the recorder view or how to record some UI elements to the video?
from videoio.
Yes, add an overlay view (UI element) to the recorder view, and at the end get a video recording of the camera feed with the overlay elements on top
from videoio.
If the UI / video clock sync is not your concern, you can use this utility to snapshot your view's content into a CVPixelBuffer. You can then use that CVPixelBuffer to create a MTIImage and use the MTIImage as an overlay on the video image. You can refer to the "CameraFilterView" and "MultilayerCompositingFilterView" examples here: https://github.com/MetalPetal/MetalPetal
/// Snapshot a view using a CGContext that is backed by a CVPixelBuffer from a CVPixelBufferPool.
class ViewSnapshoter {
    private var pixelBufferPool: MTICVPixelBufferPool?

    enum Error: String, LocalizedError {
        case cannotCreateCGContext
    }

    func snapshot(_ view: UIView, afterScreenUpdates: Bool, renderScale: CGFloat = 2) throws -> CVPixelBuffer {
        let renderWidth = Int(view.frame.width * renderScale)
        let renderHeight = Int(view.frame.height * renderScale)

        // Reuse the pool if it matches the current render size; otherwise create a new one.
        let pool: MTICVPixelBufferPool
        if let existingPool = pixelBufferPool,
           existingPool.pixelBufferWidth == renderWidth,
           existingPool.pixelBufferHeight == renderHeight {
            pool = existingPool
        } else {
            pool = try MTICVPixelBufferPool(pixelBufferWidth: renderWidth, pixelBufferHeight: renderHeight, pixelFormatType: kCVPixelFormatType_32BGRA, minimumBufferCount: 16)
            self.pixelBufferPool = pool
        }

        let pixelBuffer = try pool.makePixelBuffer(allocationThreshold: 16)
        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        defer {
            CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
        }
        guard let cgContext = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer), width: renderWidth, height: renderHeight, bitsPerComponent: 8, bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer), space: CGColorSpaceCreateDeviceRGB(), bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue /* 32-bit BGRA */) else {
            throw Error.cannotCreateCGContext
        }

        // Apply transform for UIKit coordinate system.
        cgContext.concatenate(CGAffineTransform(translationX: 0, y: CGFloat(renderHeight)))
        cgContext.concatenate(CGAffineTransform(scaleX: renderScale, y: -renderScale))

        UIGraphicsPushContext(cgContext)
        view.drawHierarchy(in: view.bounds, afterScreenUpdates: afterScreenUpdates)
        UIGraphicsPopContext()

        return pixelBuffer
    }
}
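For example, once you have the snapshot pixel buffer, wrapping it in a MTIImage and blending it over a camera frame could look roughly like this. This is a sketch based on MetalPetal's documented API, not code from this thread; `cameraPixelBuffer` and `snapshoter` are assumed to exist in your capture pipeline, and `snapshot` must be called on the main thread because it uses `drawHierarchy`:

```swift
import MetalPetal
import UIKit

// Sketch: blend a view snapshot over a camera frame.
// Assumes `snapshoter: ViewSnapshoter`, `overlayView: UIView`,
// and `cameraPixelBuffer: CVPixelBuffer` from the capture output.
func composited(cameraPixelBuffer: CVPixelBuffer,
                overlayView: UIView,
                snapshoter: ViewSnapshoter) throws -> MTIImage? {
    let overlayBuffer = try snapshoter.snapshot(overlayView, afterScreenUpdates: false)
    // The snapshot is 32-bit BGRA with premultiplied alpha (see the CGContext above).
    let overlayImage = MTIImage(cvPixelBuffer: overlayBuffer, alphaType: .premultiplied)
    let cameraImage = MTIImage(cvPixelBuffer: cameraPixelBuffer, alphaType: .alphaIsOne)

    // A normal (source-over) blend keeps the camera frame intact
    // wherever the overlay is transparent.
    let filter = MTIBlendFilter(blendMode: .normal)
    filter.inputBackgroundImage = cameraImage
    filter.inputImage = overlayImage
    return filter.outputImage
}
```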
from videoio.
thanks for the answer!
I've tried this method but I'm getting an error: "Unexpected asset record status" when I try to save.
I will try your method
import UIKit
import VideoIO
import MetalPetal
import AVFoundation
import SnapKit
import RxSwift
import RxCocoa

class MetalPetalViewController: UIViewController {
    private let filter = MTIBlendFilter(blendMode: .overlay)
    private var recorder: MovieRecorder?
    private var camera: Camera!
    private let queue: DispatchQueue = DispatchQueue(label: "org.metalpetal.capture")
    private var isRecording = BehaviorRelay<Bool>(value: false)
    private let disposeBag = DisposeBag()

    private var cameraOutputView = MTIImageView()
    private let recordButton = UIButton()
    private let overlayView = UIView()
    private let timerLabel = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        setupLayout()
        setupCamera()

        recordButton.rx.tap
            .bind {
                self.isRecording.value ? self.stopRecording() : self.startRecording()
            }.disposed(by: disposeBag)

        isRecording.map { $0 ? UIColor.blue : UIColor.red }
            .bind(to: recordButton.rx.backgroundColor)
            .disposed(by: disposeBag)

        let countDown = 100
        Observable<Int>.timer(.seconds(0), period: .seconds(1), scheduler: MainScheduler.instance)
            .take(countDown + 1)
            .subscribe(onNext: { timePassed in
                self.timerLabel.text = "\(timePassed)"
            }, onCompleted: {
                print("count down complete")
            }).disposed(by: disposeBag)
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        startRunningCaptureSession()
    }

    override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)
        stopRunningCaptureSession()
    }

    private func setupLayout() {
        view.addSubview(cameraOutputView)
        cameraOutputView.snp.makeConstraints {
            $0.edges.equalToSuperview()
        }

        overlayView.backgroundColor = .clear
        view.addSubview(overlayView)
        overlayView.snp.makeConstraints {
            $0.edges.equalToSuperview()
        }

        timerLabel.textColor = .white
        timerLabel.font = .systemFont(ofSize: 20)
        overlayView.addSubview(timerLabel)
        timerLabel.snp.makeConstraints {
            $0.center.equalToSuperview()
        }

        recordButton.layer.cornerRadius = 25
        view.addSubview(recordButton)
        recordButton.snp.makeConstraints {
            $0.size.equalTo(50)
            $0.bottom.equalToSuperview().inset(60)
            $0.centerX.equalToSuperview()
        }
    }

    private func setupCamera() {
        self.camera = Camera(captureSessionPreset: .hd1920x1080, configurator: .portraitFrontMirroredVideoOutput)
        try? camera.enableVideoDataOutput(on: queue, delegate: self)
        try? camera.enableAudioDataOutput(on: queue, delegate: self)
        camera.videoDataOutput?.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
    }

    private func startRunningCaptureSession() {
        queue.async {
            self.camera.startRunningCaptureSession()
        }
    }

    private func stopRunningCaptureSession() {
        queue.async {
            self.camera.stopRunningCaptureSession()
        }
    }

    private func startRecording() {
        let sessionID = UUID()
        let url = FileManager.default.temporaryDirectory.appendingPathComponent("\(sessionID.uuidString).mp4")
        let hasAudio = self.camera.audioDataOutput != nil
        do {
            let recorder = try MovieRecorder(url: url, configuration: MovieRecorder.Configuration(hasAudio: hasAudio))
            self.isRecording.accept(true)
            queue.async {
                self.recorder = recorder
            }
        } catch {
            handleError(error)
        }
    }

    private func stopRecording() {
        if let recorder = recorder {
            recorder.stopRecording(completion: { error in
                self.isRecording.accept(false)
                if let error = error {
                    self.handleError(error)
                } else {
                    self.handleFinishRecording(videoURL: recorder.url)
                }
            })
            queue.async {
                self.recorder = nil
            }
        }
    }

    private func handleFinishRecording(videoURL: URL) {
        navigationController?.pushViewController(VideoPlayerVC(url: videoURL), animated: true)
    }

    private func handleError(_ error: Error) {
        let alert = UIAlertController(title: "Error",
                                      message: error.localizedDescription,
                                      preferredStyle: .alert)
        alert.addAction(.init(title: "OK", style: .cancel))
        present(alert, animated: true)
    }
}

extension MetalPetalViewController: AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        guard let formatDescription = sampleBuffer.formatDescription else {
            return
        }
        switch formatDescription.mediaType {
        case .audio:
            do {
                try self.recorder?.appendSampleBuffer(sampleBuffer)
            } catch {
                print(error)
            }
        case .video:
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            filter.inputBackgroundImage = MTIImage(cvPixelBuffer: pixelBuffer,
                                                   alphaType: .nonPremultiplied)
            let inputImage = overlayView.asImage()
            filter.inputImage = .init(image: inputImage)
            DispatchQueue.main.async {
                self.cameraOutputView.image = self.filter.outputImage
            }
        default:
            break
        }
    }
}

extension UIView {
    func asImage() -> UIImage {
        let renderer = UIGraphicsImageRenderer(bounds: bounds)
        return renderer.image { rendererContext in
            layer.render(in: rendererContext.cgContext)
        }
    }
}
from videoio.
I can't see you appending any video buffer to the recorder. The internal writer cannot start without receiving video buffers, hence the "Unexpected status" error.
Also, I don't think you want to use the Overlay blend mode.
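A sketch of what appending the composited frames back to the recorder could look like (illustrative, not the thread's actual fix; it assumes a `MTIContext` named `renderContext`, a composited `MTIImage`, and VideoIO's `SampleBufferUtilities` helper that already appears later in this thread):

```swift
import MetalPetal
import VideoIO
import CoreMedia

// Sketch: render the composited MTIImage back into the camera's
// pixel buffer and append it to the recorder, so the internal
// AVAssetWriter actually receives video samples and can start.
// Assumes `renderContext: MTIContext` and `recorder: MovieRecorder?`.
func appendVideo(sampleBuffer: CMSampleBuffer,
                 composited: MTIImage,
                 renderContext: MTIContext,
                 recorder: MovieRecorder?) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    do {
        // Draw the filtered image into the original pixel buffer in place.
        try renderContext.render(composited, to: pixelBuffer)
        // Re-wrap the buffer with the original timing info and append it.
        if let output = SampleBufferUtilities.makeSampleBufferByReplacingImageBuffer(of: sampleBuffer, with: pixelBuffer) {
            try recorder?.appendSampleBuffer(output)
        }
    } catch {
        print(error)
    }
}
```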
from videoio.
Yeah, you’re right, I missed appending a buffer to the recorder, my bad.
I'm sorry to bother you, but I’m rewriting the project from OpenGL to Metal and I'm a beginner in this topic.
I tried to write it without the filter and understood that I need to merge the two buffers, but I couldn't figure out how to do it:
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    guard let formatDescription = sampleBuffer.formatDescription else {
        return
    }
    switch formatDescription.mediaType {
    case .audio:
        do {
            try self.recorder?.appendSampleBuffer(sampleBuffer)
        } catch {
            handleError(error)
        }
    case .video:
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        DispatchQueue.main.async {
            self.cameraOutputView.image = MTIImage(cvPixelBuffer: pixelBuffer,
                                                   alphaType: .alphaIsOne)
            let overlayPixelBuffer = try! self.viewSnapshoter.snapshot(
                self.overlayView,
                afterScreenUpdates: true,
                renderScale: 1
            )
            // need to merge?
            try? self.recorder?.appendSampleBuffer(
                SampleBufferUtilities.makeSampleBufferByReplacingImageBuffer(of: sampleBuffer, with: pixelBuffer)!
            )
        }
    default:
        break
    }
}
from videoio.
You need to use a filter to compose the two images together; you can use a MultilayerCompositingFilter or a normal blend filter. Make sure you read the "CameraFilterView" example.
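A MultilayerCompositingFilter version could look roughly like this, following the layer-builder pattern in MetalPetal's README (a sketch; the full-frame layer placement and the input image names are illustrative assumptions):

```swift
import MetalPetal

// Sketch: composite an overlay image over the camera image using
// MultilayerCompositingFilter instead of a blend filter.
// Assumes `cameraImage` and `overlayImage` are MTIImages.
func composite(cameraImage: MTIImage, overlayImage: MTIImage) -> MTIImage? {
    let filter = MultilayerCompositingFilter()
    filter.inputBackgroundImage = cameraImage
    filter.layers = [
        // Place the overlay covering the whole frame, fully opaque;
        // transparent overlay pixels show the camera image through.
        MultilayerCompositingFilter.Layer(content: overlayImage)
            .frame(CGRect(origin: .zero, size: cameraImage.size), layoutUnit: .pixel)
            .opacity(1)
    ]
    return filter.outputImage
}
```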
from videoio.
@MelnykovDenys are you able to share how you got it to work?
from videoio.
unfortunately not, because it doesn’t work
from videoio.
See MetalPetal/MetalPetal#320, MetalPetal/MetalPetal#89
from videoio.