remirobert / CameraEngine
:monkey::camera: Camera engine for iOS, written in Swift, above AVFoundation. :monkey:
Home Page: https://github.com/remirobert/CameraEngine
License: MIT License
CameraEngine is an awesome library; I'm loving it, and thanks a lot for making my life easier. It would be fantastic if you could add a tap-to-focus function, though. Any chance this could be done?
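For what it's worth, tap-to-focus can be sketched directly on the underlying AVCaptureDevice while waiting for library support. This is plain AVFoundation (Swift 3 syntax), not CameraEngine's API; the `device` and `layer` parameters are assumed to be the active capture device and preview layer:

```swift
import AVFoundation
import UIKit

// Hypothetical helper: focus and expose at a point tapped in the preview.
func focus(device: AVCaptureDevice, at tapPoint: CGPoint, in layer: AVCaptureVideoPreviewLayer) {
    // Convert the view coordinate into the device's (0,0)-(1,1) space.
    let point = layer.captureDevicePointOfInterest(for: tapPoint)
    do {
        try device.lockForConfiguration()
        defer { device.unlockForConfiguration() }
        if device.isFocusPointOfInterestSupported {
            device.focusPointOfInterest = point
            device.focusMode = .autoFocus
        }
        if device.isExposurePointOfInterestSupported {
            device.exposurePointOfInterest = point
            device.exposureMode = .autoExpose
        }
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}
```

Setting `focusMode` to `.autoFocus` after moving the point of interest triggers a single autofocus run at the tapped location.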
Sometimes the preview is black after I call engine.startSession(), so I have to restart my app before the preview will load. I would say this happens about 1 in 10 times.
My app is portrait-only, but when I turn the device, the camera rotates while the rest of the layout does not, so the preview appears rotated.
Hi!
I am new to iOS development and am trying to add this project to Xcode manually. Whenever I add it as a library, I cannot import the CameraEngine module; it is not found. Please explain how I can do this. Thanks!
I keep getting this error
I have downloaded this example and it builds fine, but it does not run on the phone; it doesn't appear to copy to the phone at all. Am I missing something?
I am trying to get the flash to switch on, but no luck so far. I tried it both in my app and in the example app installed on the device, but the flash does not switch on in configureTorch or configureFlash. Any idea how I should debug this? When I print currentDevice.flashAvailable, it is set to true, and I get inside the do-catch block without even hitting the lock-configuration error. The flash works in my regular camera app. Any help would be appreciated, as I am building a prototype and really need the flash feature working for my camera. Just FYI: in photo mode the flash works perfectly fine. The problem appears when I switch the camera to video.
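One thing worth checking: in AVFoundation the photo flash (flashMode) and the continuous video light (torchMode) are separate features, and video recording needs the torch. A minimal sketch at the AVCaptureDevice level (Swift 3 syntax; this is generic AVFoundation, not CameraEngine's API, so treat the diagnosis as an assumption):

```swift
import AVFoundation

// For video recording, the "flash" is the torch, not the photo flash.
// `device` is assumed to be the active back camera.
func enableTorch(on device: AVCaptureDevice) {
    guard device.hasTorch && device.isTorchAvailable else { return }
    do {
        try device.lockForConfiguration()
        device.torchMode = .on
        device.unlockForConfiguration()
    } catch {
        print("Torch configuration failed: \(error)")
    }
}
```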
I have the following code in viewDidLoad(), and it crashes the app with the error 'Can't add a nil AVCaptureInput':
    override func viewDidLoad() {
        super.viewDidLoad()
        CameraEngine.startup()
        let preview: AVCaptureVideoPreviewLayer = CameraEngine.getPreviewLayer()
        preview.frame = self.view.bounds
        self.view.layer.addSublayer(preview)
        self.view.addSubview(takePhotoButton)
        cameraFlash.setImage(noFlash, forState: .Normal)
        CameraEngine.shareInstance().flash = false
        self.view.addSubview(cameraFlash) // flash light button
        photoLibrary.setImage(album, forState: .Normal)
        self.view.addSubview(photoLibrary) // open photo album button
    }
It fails to build saying that the value of type 'CameraEngine' has no member 'cameraZoomFactor'.
This happens when building the app for "In House" distribution using CocoaPods.
Failed to verify code signature of /private/var/installd/Library/Caches/com.apple.mobile.installd.staging/....../Frameworks/CameraEngine.framework
I'm not too familiar with CocoaPods, but it seems the Info.plist file is the culprit. When exporting the .ipa file, CameraEngine had a "binary"/"executable" icon instead of the framework icon the usual pods have.
I'm trying out the swift-3.0 branch; however, there still seem to be a lot of errors.
The app crashes when I try to take a photo with front camera. This was working fine before updating to Swift 3 syntax.
Here is the debug output I got:
invalid mode 'kCFRunLoopCommonModes' provided to CFRunLoopRunSpecific - break on _CFRunLoopError_RunCalledWithInvalidMode to debug. This message will only appear once per execution.
[AVCapturePhotoOutput capturePhotoWithSettings:delegate:] flashMode must be set to a value present in the supportedFlashModes array'
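That second message suggests the settings asked for a flash mode the output doesn't support; front cameras often support .off only. A hedged guard at the AVCapturePhotoOutput level (Swift 3, where supportedFlashModes is imported as [NSNumber]):

```swift
import AVFoundation

// Only request a flash mode the output actually supports.
// `output` is assumed to be the app's AVCapturePhotoOutput.
func makeSettings(for output: AVCapturePhotoOutput, wanted: AVCaptureFlashMode) -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()
    if output.supportedFlashModes.contains(NSNumber(value: wanted.rawValue)) {
        settings.flashMode = wanted
    } else {
        settings.flashMode = .off   // safe fallback
    }
    return settings
}
```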
When you switch cameras, make sure of this: you must remove and recreate the AVCaptureVideoDataOutput as well.
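Sketched out, a camera switch that follows this advice might look like the following. This is generic AVFoundation (Swift 3 syntax), not CameraEngine's internal code; both the input and the AVCaptureVideoDataOutput are rebuilt inside one configuration transaction:

```swift
import AVFoundation

// Hypothetical switch: replace the current input AND recreate the
// video data output between beginConfiguration/commitConfiguration.
func switchCamera(session: AVCaptureSession,
                  oldInput: AVCaptureDeviceInput,
                  oldOutput: AVCaptureVideoDataOutput,
                  newDevice: AVCaptureDevice) throws -> (AVCaptureDeviceInput, AVCaptureVideoDataOutput) {
    let newInput = try AVCaptureDeviceInput(device: newDevice)
    let newOutput = AVCaptureVideoDataOutput()

    session.beginConfiguration()
    session.removeInput(oldInput)
    session.removeOutput(oldOutput)
    if session.canAddInput(newInput) { session.addInput(newInput) }
    if session.canAddOutput(newOutput) { session.addOutput(newOutput) }
    session.commitConfiguration()
    return (newInput, newOutput)
}
```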
When taking a sequence of photos with the camera example, I get the following error:
-[AVCapturePhotoOutput capturePhotoWithSettings:delegate:] Settings may not be re-used'
Any idea how to create a unique value for the settings?
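AVCapturePhotoSettings carries a uniqueID and is documented as single-use, so the usual fix is simply to build a fresh instance for every shot rather than trying to make a reused one unique. A minimal sketch (Swift 3):

```swift
import AVFoundation

// Create a brand-new AVCapturePhotoSettings for every capture;
// reusing an instance triggers "Settings may not be re-used".
func capture(with output: AVCapturePhotoOutput, delegate: AVCapturePhotoCaptureDelegate) {
    let settings = AVCapturePhotoSettings()   // fresh instance, fresh uniqueID
    output.capturePhoto(with: settings, delegate: delegate)
}
```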
Does the 'swift-3.0' branch compile with Xcode 8/Swift 3? I cannot compile it. Am I doing something wrong?
Getting tons of errors trying to use it in Swift 3.0
Ever since updating to Xcode 8 and Swift 3, I've been getting a crash in CameraEngineOutput.swift, in func configureCaptureOutput, at:

    session.addOutput(self.captureVideoOutput)
I've updated my podfile to point to the 1.0 branch and the app builds fine, but it crashes on the splash screen giving the error:
Thread 4: EXC_BAD_ACCESS (code=1, address=0x50)
(Note: Thread 4 is the capturesession thread)
Has anyone else been getting this? I'm at a loss as to what to do. Currently it's taking down my whole app, not just the camera tab. Everything was working great before the switch to Swift 3.
Thanks.
I tried to hardcode the height and width for the VideoEncoder's process,

    _encoder = [VideoEncoder encoderForPath:path Height:_cy width:_cx channels:_channels samples:_samplerate];

as follows:

    // hard-code the resolution instead of reading it from videoSettings
    NSDictionary *actual = _videoDataOutput.videoSettings;
    // _cy = [[actual objectForKey:@"Height"] floatValue];
    // _cx = [[actual objectForKey:@"Width"] floatValue];
    _cy = VIDEO_RESOLUTION_HEIGHT;
    _cx = VIDEO_RESOLUTION_WIDTH;

Unfortunately, the video came out distorted, even though it is indeed square.
If another app is already playing sound, how do I keep the preview layer from stopping it?
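One common approach, assuming the problem is the capture session taking over the audio session, is to stop AVCaptureSession from reconfiguring audio and mark the shared AVAudioSession as mixable. This is a generic AVFoundation sketch (Swift 3 syntax), not something CameraEngine is known to expose:

```swift
import AVFoundation

// Sketch: let the capture session coexist with other apps' audio.
// `session` is assumed to be the app's AVCaptureSession.
func keepBackgroundAudioPlaying(session: AVCaptureSession) throws {
    // Stop AVCaptureSession from taking over the audio session.
    session.automaticallyConfiguresApplicationAudioSession = false
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord,
                                                    with: [.mixWithOthers])
    try AVAudioSession.sharedInstance().setActive(true)
}
```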
Hey Remy. I was trying to get the new version into my project, which targets a minimum of iOS 8.0, but CameraEngine requires a minimum of 9.2 for pod install. Is there a way to fix this?
Why has iOS 10 been set as the minimum deployment target? Can it be changed to at least 9.0, so that iPhone 4 users could also use my app with CameraEngine?
Hi, I am trying to follow the Quick Start in the README.md, and I get a blank screen when I run the app.
    import UIKit
    import CameraEngine

    class ViewController: UIViewController {

        let cameraEngine = CameraEngine()

        override func viewDidLoad() {
            super.viewDidLoad()
            self.cameraEngine.startSession()
        }

        override func didReceiveMemoryWarning() {
            super.didReceiveMemoryWarning()
            // Dispose of any resources that can be recreated.
        }

        override func viewDidLayoutSubviews() {
            let layer = self.cameraEngine.previewLayer
            layer.frame = self.view.bounds
            self.view.layer.insertSublayer(layer, atIndex: 0)
            self.view.layer.masksToBounds = true
        }
    }
Please refer to the code here, any help appreciated
With an upgrade to CocoaPods, the Podfile for the example requires concrete targets; otherwise the following errors result:

    [!] The dependency CameraEngine (~> 0.9) is not used in any concrete target.
        The dependency PBJVideoPlayer is not used in any concrete target.
        The dependency FCFileManager is not used in any concrete target.
        The dependency FLAnimatedImage is not used in any concrete target.
I did a pod init to generate a new Podfile with the target:

    target 'CameraEngineExample'
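For reference, a minimal Podfile with a concrete target might look like this; the pod names and the version constraint are taken from the error messages above, and the platform line is an assumption:

```ruby
# Example Podfile: every pod now lives inside a concrete target.
platform :ios, '9.0'
use_frameworks!

target 'CameraEngineExample' do
  pod 'CameraEngine', '~> 0.9'
  pod 'PBJVideoPlayer'
  pod 'FCFileManager'
  pod 'FLAnimatedImage'
end
```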
I'm trying everything to build and run the example project on my iDevice, but it gives me two errors: "FCFileManager module not found" and "CameraEngine module not found". What is a workaround? Thank you.
When I start a session for the first time, it shows a dialog asking the user for microphone permission. I do not want the microphone to be used or audio to be processed; all I want is a camera preview. How can I disable audio recording?
Thanks
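CameraEngine may not expose a switch for this, but at the AVFoundation level the microphone prompt is only triggered when an audio input is added to the session. A hedged sketch of a video-only session (Swift 3 syntax):

```swift
import AVFoundation

// Video-only session: never adding an audio input means the
// microphone permission dialog is never triggered.
func makeVideoOnlySession() throws -> AVCaptureSession {
    let session = AVCaptureSession()
    guard let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo) else {
        return session // no camera available (e.g. the simulator)
    }
    let input = try AVCaptureDeviceInput(device: camera)
    if session.canAddInput(input) { session.addInput(input) }
    // Deliberately no AVCaptureDeviceInput for AVMediaTypeAudio here.
    return session
}
```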
When recording a video, the video starts black for a few frames. It happens rather frequently.
Hi, I would like to add a focusing animation, but unfortunately I can't get access to the isAdjustingFocus/isAdjustingExposure/isAdjustingWhiteBalance properties of AVCaptureDevice.
Also, due to the lack of a protected access level in Swift, I can't expose those properties via an extension.
I would really appreciate access to those indicators, but in case you really oppose adding them to your API, can you think of any solution that doesn't involve managing my own fork?
Thanks!
Please update the swift 3.0 branch to current syntax.
Is it possible to get AVCaptureVideoDataOutputSampleBufferDelegate running ?
I got this fatal error:

```swift
fatalError("[CameraEngine] error initInputDevice")
```

And when I printed out the error, it was this one:

    Error Domain=AVFoundationErrorDomain Code=-11852 "Cannot use iPod Microphone" UserInfo={NSLocalizedDescription=Cannot use iPod Microphone, AVErrorDeviceKey=<AVCaptureFigAudioDevice: 0x14f66450 [iPod Microphone][com.apple.avfoundation.avcapturedevice.built-in_audio:0]>, NSLocalizedFailureReason=This app is not authorized to use iPod Microphone.}

I know I have turned off the microphone permission, but this error should not occur if I only use the camera without video. For testing I will turn the microphone on, but I should be able to take photos without this permission.
In the example, self.frames.removeAll() is called in the changeModeCapture action in ViewController.swift. If you take a GIF and go to the preview, but then go back to the view controller to take another GIF, the frames are not removed.
...Pods/CameraEngine/CameraEngine/CameraEngineMetadataOutput.swift:91:18: Expression pattern of type 'String' cannot match values of type 'String!'
There is a CocoaPods issue with Swift 3.
Regards
I am recording a video in portrait mode and for some reason I am getting the video in landscape mode when I view it.
Sometimes the recorded video has a black initial frame. Any clue why?
The camera preview in the example stays in portrait orientation when the phone is in landscape orientation.
The video captured with CameraEngine has a short delay with a black screen but with audio, lasting about a second.
Is everything okay there? Maybe you can reproduce this issue.
Thanks a lot.
Error: fatal error: unexpectedly found nil while unwrapping an Optional value
at this line of code:

    self.previewLayer.connection.videoOrientation = AVCaptureVideoOrientation.orientationFromUIDeviceOrientation(UIDevice.currentDevice().orientation)

in the function:
    private func handleDeviceOrientation() {
        if self.rotationCamera {
            UIDevice.currentDevice().beginGeneratingDeviceOrientationNotifications()
            NSNotificationCenter.defaultCenter().addObserverForName(UIDeviceOrientationDidChangeNotification, object: nil, queue: NSOperationQueue.mainQueue()) { (_) -> Void in
                self.previewLayer.connection.videoOrientation = AVCaptureVideoOrientation.orientationFromUIDeviceOrientation(UIDevice.currentDevice().orientation)
            }
        }
        else {
            UIDevice.currentDevice().endGeneratingDeviceOrientationNotifications()
            NSNotificationCenter.defaultCenter().removeObserver(self, name: UIDeviceOrientationDidChangeNotification, object: nil)
        }
    }
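The force unwrap suggests previewLayer.connection is still nil when the orientation notification fires (for example, before the session has an input). A defensive version of the observer body, in the same Swift 2 syntax as the code above, might be:

```swift
// Bail out while the preview layer has no connection yet.
if let connection = self.previewLayer.connection where connection.supportsVideoOrientation {
    connection.videoOrientation = AVCaptureVideoOrientation.orientationFromUIDeviceOrientation(UIDevice.currentDevice().orientation)
}
```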
@remirobert
Videos and photos taken using the front-facing camera are mirrored from what is shown in the previewLayer. I know even the iOS camera behaves like this, but I prefer the Snapchat & Instagram style. How can this be fixed?
UPDATE:
In my preview playback, I un-mirrored it with

    self.player.view.transform = CGAffineTransformMakeScale(-1.0, 1.0)

but this is just for playback; it doesn't record the video un-mirrored.
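To change what actually gets recorded rather than the playback, the usual AVFoundation route is to configure mirroring on the output's video connection before recording. A sketch (Swift 3 syntax; `output` is assumed to be the movie or video data output in use):

```swift
import AVFoundation

// Sketch: control mirroring at capture time on the output connection.
// Call before starting to record.
func setMirroring(on output: AVCaptureOutput, mirrored: Bool) {
    guard let connection = output.connection(withMediaType: AVMediaTypeVideo) else { return }
    connection.automaticallyAdjustsVideoMirroring = false
    if connection.isVideoMirroringSupported {
        connection.isVideoMirrored = mirrored
    }
}
```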
@remirobert
I would like to be able to instantly show a preview of a video taken by the user, just like Snapchat and, more impressively, Instagram Stories. Somehow, once a user stops capturing a video in both apps, a preview automatically starts playing in an endless loop, seemingly within the same view.
Can I achieve that with CameraEngine?
I've implemented this tool as well, but it seems to crash when I take videos. The crash happens after around 2.5 seconds of recording because of a memory warning.
Any ideas?
When is this going to be converted to Swift 3 and pushed to master? The existing "swift-3.0" branch is not usable, with 166 errors.
There seems to be an issue with the Swift 3 branch; I can't get it to work. I download the branch via CocoaPods, and it seems to be in Swift 3 syntax, but it throws all kinds of funky errors.
I can fix the errors and push back a branch if needed. Would that be possible?
Hi, is it possible to use CameraEngine to make slow-motion videos?
Drew
thanks.