Comments (10)

YuAo avatar YuAo commented on May 26, 2024

Can you be more specific? Are you asking how to add UI to the recorder view or how to record some UI elements to the video?

from videoio.

MelnykovDenys avatar MelnykovDenys commented on May 26, 2024

Yes, I'd like to add an overlay view (a UI element) on top of the recorder view and, at the end, get a video recording that contains both the camera footage and the overlay elements on top.


YuAo avatar YuAo commented on May 26, 2024

If the UI / Video clock sync is not your concern, you can use this utility to snapshot your view's content into a CVPixelBuffer.

You can then use this CVPixelBuffer to create an MTIImage and use that MTIImage as an overlay of the video image. You can refer to the "CameraFilterView" and "MultilayerCompositingFilterView" examples at https://github.com/MetalPetal/MetalPetal

import UIKit
import MetalPetal

/// Snapshots a view using a CGContext that is backed by a CVPixelBuffer from a CVPixelBufferPool.
class ViewSnapshoter {
    private var pixelBufferPool: MTICVPixelBufferPool?
    
    enum Error: String, LocalizedError {
        case cannotCreateCGContext
    }
    
    func snapshot(_ view: UIView, afterScreenUpdates: Bool, renderScale: CGFloat = 2) throws -> CVPixelBuffer {
        let renderWidth = Int(view.frame.width * renderScale)
        let renderHeight = Int(view.frame.height * renderScale)
        let pool: MTICVPixelBufferPool
        // Reuse the existing pool only while its dimensions still match the view.
        if let pixelBufferPool = pixelBufferPool, pixelBufferPool.pixelBufferWidth == renderWidth, pixelBufferPool.pixelBufferHeight == renderHeight {
            pool = pixelBufferPool
        } else {
            pool = try MTICVPixelBufferPool(pixelBufferWidth: renderWidth, pixelBufferHeight: renderHeight, pixelFormatType: kCVPixelFormatType_32BGRA, minimumBufferCount: 16)
            self.pixelBufferPool = pool
        }
        let pixelBuffer = try pool.makePixelBuffer(allocationThreshold: 16)
        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        defer {
            CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
        }
        guard let cgContext = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer), width: renderWidth, height: renderHeight, bitsPerComponent: 8, bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer), space: CGColorSpaceCreateDeviceRGB(), bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue /* 32-bit BGRA */) else {
            throw Error.cannotCreateCGContext
        }
        
        // Apply transform for UIKit coordinate system.
        cgContext.concatenate(CGAffineTransform(translationX: 0, y: CGFloat(renderHeight)))
        cgContext.concatenate(CGAffineTransform(scaleX: renderScale, y: -renderScale))
        
        UIGraphicsPushContext(cgContext)
        view.drawHierarchy(in: view.bounds, afterScreenUpdates: afterScreenUpdates)
        UIGraphicsPopContext()
        
        return pixelBuffer
    }
}
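Once you have the snapshot buffer, a minimal sketch of blending it over a camera frame could look like the following. This assumes MetalPetal's `MTIImage` and `MTIBlendFilter` APIs; the helper function and parameter names are hypothetical.

```swift
import CoreVideo
import MetalPetal

// Hypothetical helper: composite a view snapshot over a camera frame.
// `cameraBuffer` is a frame from the capture output; `snapshotBuffer`
// is the result of ViewSnapshoter.snapshot(_:afterScreenUpdates:).
func overlaySnapshot(cameraBuffer: CVPixelBuffer, snapshotBuffer: CVPixelBuffer) -> MTIImage? {
    // Camera frames are opaque; the snapshot was drawn by CoreGraphics
    // with premultiplied alpha (see the CGContext setup above).
    let background = MTIImage(cvPixelBuffer: cameraBuffer, alphaType: .alphaIsOne)
    let overlay = MTIImage(cvPixelBuffer: snapshotBuffer, alphaType: .premultiplied)
    let filter = MTIBlendFilter(blendMode: .normal) // .normal, not .overlay
    filter.inputBackgroundImage = background
    filter.inputImage = overlay
    return filter.outputImage
}
```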


MelnykovDenys avatar MelnykovDenys commented on May 26, 2024

thanks for the answer!
I've tried this method, but I'm getting an error when I try to save: "Unexpected asset record status".

I will try your method

import UIKit
import VideoIO
import MetalPetal
import AVFoundation
import SnapKit
import RxSwift
import RxCocoa

class MetalPetalViewController: UIViewController {
    
    private let filter = MTIBlendFilter(blendMode: .overlay)
    private var recorder: MovieRecorder?
    private var camera: Camera!

    private let queue: DispatchQueue = DispatchQueue(label: "org.metalpetal.capture")
    private var isRecording = BehaviorRelay<Bool>(value: false)
    private let disposeBag = DisposeBag()
    
    private var cameraOutputView = MTIImageView()
    private let recordButton = UIButton()
    private let overlayView = UIView()
    private let timerLabel = UILabel()
    
    override func viewDidLoad() {
        super.viewDidLoad()
        setupLayout()
        setupCamera()
        
        recordButton.rx.tap
            .bind {
                self.isRecording.value ? self.stopRecording() : self.startRecording()
            }.disposed(by: disposeBag)
        
        isRecording.map { $0 ? UIColor.blue : UIColor.red }
            .bind(to: recordButton.rx.backgroundColor)
            .disposed(by: disposeBag)
        
        let countDown = 100
        Observable<Int>.timer(.seconds(0), period: .seconds(1), scheduler: MainScheduler.instance)
            .take(countDown + 1)
            .subscribe(onNext: { timePassed in
                self.timerLabel.text = "\(timePassed)"
            }, onCompleted: {
                print("count down complete")
            }).disposed(by: disposeBag)
    }
    
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        startRunningCaptureSession()
    }
    
    override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)
        stopRunningCaptureSession()
    }
    
    private func setupLayout() {
        view.addSubview(cameraOutputView)
        cameraOutputView.snp.makeConstraints {
            $0.edges.equalToSuperview()
        }
        
        overlayView.backgroundColor = .clear
        view.addSubview(overlayView)
        overlayView.snp.makeConstraints {
            $0.edges.equalToSuperview()
        }
        
        timerLabel.textColor = .white
        timerLabel.font = .systemFont(ofSize: 20)
        overlayView.addSubview(timerLabel)
        timerLabel.snp.makeConstraints {
            $0.center.equalToSuperview()
        }
        
        recordButton.layer.cornerRadius = 25
        view.addSubview(recordButton)
        recordButton.snp.makeConstraints {
            $0.size.equalTo(50)
            $0.bottom.equalToSuperview().inset(60)
            $0.centerX.equalToSuperview()
        }
    }
    
    private func setupCamera() {
        self.camera = Camera(captureSessionPreset: .hd1920x1080, configurator: .portraitFrontMirroredVideoOutput)
        try? camera.enableVideoDataOutput(on: queue, delegate: self)
        try? camera.enableAudioDataOutput(on: queue, delegate: self)
        
        camera.videoDataOutput?.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
    }
    
    private func startRunningCaptureSession() {
        queue.async {
            self.camera.startRunningCaptureSession()
        }
    }
    
    private func stopRunningCaptureSession() {
        queue.async {
            self.camera.stopRunningCaptureSession()
        }
    }
    
    private func startRecording() {
        let sessionID = UUID()
        let url = FileManager.default.temporaryDirectory.appendingPathComponent("\(sessionID.uuidString).mp4")
        let hasAudio = self.camera.audioDataOutput != nil
        do {
            let recorder = try MovieRecorder(url: url, configuration: MovieRecorder.Configuration(hasAudio: hasAudio))
            self.isRecording.accept(true)
            queue.async {
                self.recorder = recorder
            }
        } catch {
            handleError(error)
        }
    }
    
    private func stopRecording() {
        if let recorder = recorder {
            recorder.stopRecording(completion: { error in
                self.isRecording.accept(false)
                if let error = error {
                    self.handleError(error)
                } else {
                    self.handleFinishRecording(videoURL: recorder.url)
                }
            })
            queue.async {
                self.recorder = nil
            }
        }
    }
    
    private func handleFinishRecording(videoURL: URL) {
        navigationController?.pushViewController(VideoPlayerVC(url: videoURL), animated: true)
    }
    
    private func handleError(_ error: Error) {
        let alert = UIAlertController(title: "Error",
                                      message: error.localizedDescription,
                                      preferredStyle: .alert)
        alert.addAction(.init(title: "OK", style: .cancel))
        present(alert, animated: true)
    }
}

extension MetalPetalViewController: AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        guard let formatDescription = sampleBuffer.formatDescription else {
            return
        }
        switch formatDescription.mediaType {
        case .audio:
            do {
                try self.recorder?.appendSampleBuffer(sampleBuffer)
            } catch {
                print(error)
            }
        case .video:
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            filter.inputBackgroundImage = MTIImage(cvPixelBuffer: pixelBuffer,
                                                   alphaType: .nonPremultiplied)
            let inputImage = overlayView.asImage()
            filter.inputImage = .init(image: inputImage)
            DispatchQueue.main.async {
                self.cameraOutputView.image = self.filter.outputImage
            }
        default:
            break
        }
    }
}

extension UIView {
    func asImage() -> UIImage {
        let renderer = UIGraphicsImageRenderer(bounds: bounds)
        return renderer.image { rendererContext in
            layer.render(in: rendererContext.cgContext)
        }
    }
}


YuAo avatar YuAo commented on May 26, 2024

I don't see you appending any video buffers to the recorder. The internal asset writer cannot start without receiving video buffers, hence the "Unexpected asset record status" error.

Also, I don't think you want to use the .overlay blend mode.
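For reference, a hedged sketch of the missing append step, using VideoIO's `SampleBufferUtilities` together with an `MTIContext` render; the variable and parameter names here are assumptions:

```swift
import AVFoundation
import MetalPetal
import VideoIO

// Hypothetical sketch: render the composited MTIImage into a pixel buffer
// and append it to the recorder, keeping the original sample timing.
func appendVideoFrame(from sampleBuffer: CMSampleBuffer,
                      image: MTIImage,
                      context: MTIContext,
                      pool: MTICVPixelBufferPool,
                      recorder: MovieRecorder) throws {
    let renderedBuffer = try pool.makePixelBuffer(allocationThreshold: 16)
    try context.render(image, to: renderedBuffer)
    // Reuse the timing info of the original camera sample buffer.
    if let newSampleBuffer = SampleBufferUtilities.makeSampleBufferByReplacingImageBuffer(of: sampleBuffer, with: renderedBuffer) {
        try recorder.appendSampleBuffer(newSampleBuffer)
    }
}
```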


MelnykovDenys avatar MelnykovDenys commented on May 26, 2024

Yeah, you’re right, I missed appending the video buffer to the recorder, my bad.
I'm sorry to bother you, but I'm rewriting the project from OpenGL to Metal and I'm a beginner in this topic.
I tried writing without the filter and realized that I need to merge the two buffers, but I don't understand how to do it:

  func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        guard let formatDescription = sampleBuffer.formatDescription else {
            return
        }
        switch formatDescription.mediaType {
        case .audio:
            do {
                try self.recorder?.appendSampleBuffer(sampleBuffer)
            } catch {
                handleError(error)
            }
        case .video:
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            
            DispatchQueue.main.async {
                self.cameraOutputView.image = MTIImage(cvPixelBuffer: pixelBuffer,
                                                       alphaType: .alphaIsOne)
                
                let overlayPixelBuffer = try! self.viewSnapshoter.snapshot(
                    self.overlayView,
                    afterScreenUpdates: true,
                    renderScale: 1
                )
                
                //need to merge?
                
                try? self.recorder?.appendSampleBuffer(
                    SampleBufferUtilities.makeSampleBufferByReplacingImageBuffer(of: sampleBuffer, with: pixelBuffer)!
                )
            }
        default:
            break
        }
    }


YuAo avatar YuAo commented on May 26, 2024

You need to use a filter to composite the two images together; you can use a MultilayerCompositingFilter or a normal blend filter. Make sure you read the "CameraFilterView" example.
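A minimal sketch of the MultilayerCompositingFilter approach, assuming MetalPetal's layer layout API; the function name and frame values here are placeholders:

```swift
import MetalPetal

// Hypothetical sketch: place an overlay image on top of the camera image.
func composite(cameraImage: MTIImage, overlayImage: MTIImage) -> MTIImage? {
    let filter = MultilayerCompositingFilter()
    filter.inputBackgroundImage = cameraImage
    filter.layers = [
        // Stretch the overlay across the whole background frame.
        MultilayerCompositingFilter.Layer(content: overlayImage)
            .frame(CGRect(origin: .zero, size: cameraImage.size), layoutUnit: .pixel)
    ]
    return filter.outputImage
}
```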


wesselpeder avatar wesselpeder commented on May 26, 2024

@MelnykovDenys are you able to share how you got it to work?


MelnykovDenys avatar MelnykovDenys commented on May 26, 2024

@MelnykovDenys are you able to share how you got it to work?

Unfortunately not, because it doesn’t work yet.


YuAo avatar YuAo commented on May 26, 2024

See MetalPetal/MetalPetal#320, MetalPetal/MetalPetal#89

