rFlex / SCRecorder
iOS camera engine with Vine-like tap to record, animatable filters, slow motion, segments editing
License: Apache License 2.0
Hey guys.
I have everything working video-wise, but I was wondering if, or maybe how, one can implement the Snapchat-like filtering of SCFilterGroup with an image rather than a video.
Any help would be great.
Thanks
I'm using SCViewController to record video, but I don't know how to customize the size of the video, e.g. 640x640.
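One possible answer, based on the `videoSize` property that appears in other code samples in this thread (untested against your exact library version): set it on the record session before recording.

```objc
// Hedged sketch: SCRecordSession exposes a videoSize property (used
// elsewhere in this thread with CGSizeMake(320, 320)); setting it before
// recording should produce square 640x640 output.
SCRecordSession *session = [SCRecordSession recordSession];
session.videoSize = CGSizeMake(640, 640);
_recorder.recordSession = session;
```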
Hey there!
There is a problem when running the camera for the first time. I wasn't able to get any error messages from any of the delegate methods, so I tried listening to as many notifications as possible to find out what's going on. For some reason, an AVAudioSessionMediaServicesWereResetNotification is posted. I worked around it with some code that closes and reopens the session, but I still don't know what causes this.
@rFlex
When I finish recording audio, go back to the home page, and then enter video recording again, the app crashes. The reason is:
SCAudioVideoRecorderExample[4306:60b] *** -[SCCamera release]: message sent to deallocated instance 0x16de65a0
Hi,
I would like to use this library but the need in my app is only to capture the audio, not the video. Can I still use this library?
Thanks,
Varun
I thought "didBeginRecordSegment" was called when segment recording begins. It's actually called after "didEndRecordSegment".
Hi rFlex,
When the SCRecorder has audio enabled, SCAssetExportSession crashes as soon as it begins reading in the data while exporting the video file with a filter on top of it. Have you seen this?
Thanks
Hey, thanks for the great library. We are using it in something really cool, and I can't wait to show you!
So, to the problem at hand.
When I'm using the exact same code as the example on iOS, I often encounter a writing error on the AVAssetWriter in SCAudioVideoRecorder.m:
*** Terminating app due to uncaught exception
'NSInternalInconsistencyException', reason:
'*** -[AVAssetWriter finishWritingWithCompletionHandler:]
Cannot call method when status is 1'
The error occurs on line 266:
- (void)finishWriter:(NSURL *)fileUrl {
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
    [self.assetWriter finishWritingWithCompletionHandler:^{ // <-- Here
        dispatch_async(self.dispatch_queue, ^{
            [self stopInternal];
        });
    }];
Hope that you can help with this!
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'CoreStreamBase<>(0x15e29718 e:ORWEDM p:orwedm s:O): An -observeValueForKeyPath:ofObject:change:context: message was received but not handled.
Key path: loadedTimeRanges
Observed object: <AVPlayerItem: 0x15f30490, asset = <AVURLAsset: 0x15efa980, URL = file:///private/var/mobile/Applications/A3DDAC5A-C04F-4AC0-AA08-E0A931E7EB8E/tmp/4ef635f1-0240-4414-bd13-f60e68e96349-video.mp4>>
Change: {
kind = 1;
new = (
"CMTimeRange: {{0/1 = 0.000}, {222234/44100 = 5.039}}"
);
}
Context: 0x0'
Hey, we got a crash on this. Is this something you are aware of? We could certainly take a look at it ourselves if you don't have any quick ideas about what this is. It seems to be called after the item has removed the observer, but I'm not that familiar with KVO.
It would be great if the lib could take a UIImage snapshot of the current previewLayer, since renderInContext: doesn't work and you otherwise have to use methods that might break with the lib:
http://stackoverflow.com/questions/3397899/avcapturevideopreviewlayer-taking-a-snapshot
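One workaround discussed around that Stack Overflow question (iOS 7+, not part of this library, and behavior varies by iOS version and device) is to snapshot the view hosting the preview layer:

```objc
// -drawViewHierarchyInRect:afterScreenUpdates: can capture
// AVCaptureVideoPreviewLayer content in cases where
// -renderInContext: produces a blank image. `previewView` is assumed
// to be the UIView that hosts the preview layer.
UIGraphicsBeginImageContextWithOptions(previewView.bounds.size, YES, 0);
[previewView drawViewHierarchyInRect:previewView.bounds afterScreenUpdates:YES];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
```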
It should support recording in either landscape or portrait, either by setting it in code or automatically using the accelerometer info.
If that's difficult to do, then I would appreciate some sample code on how to rotate the video.
Hey,
Thanks for your source code!
I have an issue: sometimes the camera can't open.
I'm looking forward to your reply!
Thanks and best regards!
Hey,
Thanks for this awesome library.
I have an issue with recording when coming from the background: the session does not get recorded. The issue exists in the sample application as well. I need help with this.
I'm trying to follow the example app and it looks like the retake button isn't implemented. Looking at the code I see that it only resets the timer label and ensures that a session exists. Can you please provide some sample code to demonstrate the best way to implement a retake on the current session?
To reproduce:
Tap to record some video (segment A)
Tap retake button
Tap to record more video (segment B)
Tap stop
Expected:
The final output consists of only video segment B
Observed:
The final output is both segment A and segment B
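A hedged sketch of a retake handler that would produce the expected behavior, using only the `cancelSession:`, `recordSession`, and `prepareCamera` APIs that appear elsewhere in this thread (assuming the session has not been saved and can simply be discarded):

```objc
// Discard everything recorded so far and start a fresh session, so a
// later recording (segment B) is not appended after segment A.
- (void)handleRetakeButtonTapped:(id)sender {
    SCRecordSession *recordSession = _recorder.recordSession;
    if (recordSession != nil) {
        _recorder.recordSession = nil;
        [recordSession cancelSession:nil]; // removes the recorded segment files
    }
    [self prepareCamera]; // creates a new, empty record session
}
```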
We are using SCRecorder and we can reproduce an issue on some devices. We see that when running a session, this line:
https://github.com/rFlex/SCRecorder/blob/master/Library/Sources/SCRecorder.m#L329
will be in a state where _hasAudio = YES but recordSession.audioInitialized = NO.
In this state, video will not record. Perhaps this is our implementation, but on review it seems quite standard. We use lazy instantiation in our implementation, which looks like:
- (SCRecordSession *)session
{
    if (!_session) {
        _session = [SCRecordSession recordSession];
        _session.videoSize = CGSizeMake(320, 320);
        _session.suggestedMaxRecordDuration = CMTimeMakeWithSeconds(kMaxRecordingSeconds, 1);
    }
    return _session;
}

- (SCRecorder *)camera
{
    if (!_camera) {
        _camera = [SCRecorder recorder];
        _camera.recordSession = self.session;
        _camera.sessionPreset = AVCaptureSessionPresetInputPriority;
        _camera.videoOrientation = AVCaptureVideoOrientationPortrait;
        _camera.videoEnabled = YES;
        _camera.audioEnabled = YES;
        _camera.device = AVCaptureDevicePositionFront;
        _camera.flashMode = SCFlashModeOff;
        _camera.previewView = self.previewView;
        _camera.delegate = self;
    }
    return _camera;
}
Any ideas why we get this intermittent issue with recording?
I noticed that some of the encoding code for the quality settings is commented out and not working.
Hi, thank you for your project. It helps me a lot. I want to add a filter to the video, but I can't find SCFilter.h. Where is it?
Hi rFlex,
You were able to fix my last problem really quickly, and I'm hoping you can do it again. I am recording a video using your library, and when it is done I set the asset on your SCPlayer and then let the user apply a filter on top of it. However, when I call SCAssetExportSession to get the newly filtered video, I run into a crash. If audio is enabled on the video being recorded, exporting triggers an exception breakpoint in exportAsynchronouslyWithCompletionHandler:(void (^)())completionHandler at the line if (![_reader startReading]) with a bad access. I can jump over the exception breakpoint and it will export, but in a production build it crashes the app. However, if I either run with _recorder.audioEnabled = NO, or change
_audioOutput = [self addReader:[audioTracks objectAtIndex:0] withSettings:@{ AVFormatIDKey : [NSNumber numberWithUnsignedInt:kAudioFormatType] }]; to _audioOutput = nil; in SCAssetExportSession, it works, because the audio is not being processed. Do you have any idea what may be causing this issue?
Thanks
Hi! rFlex
Why is there an error when I set the maximum recording time and keep holding down the record button? If I record intermittently (holding down, releasing, holding down, releasing), the error doesn't appear.
I'm getting a slow frame rate while viewing the video feed, as well as when the generated video is played. This is not happening on the iPhone 5 or iPhone 4S.
Which property should I configure to help the iPhone 4 processor produce optimal video?
Thank you very much.
Can it be ported to the Android platform?
The autofocus goes crazy sometimes, especially at the beginning of the video, and there's no apparent way to change it. The "focusMode" property on an SCRecorder instance is read-only, so I can't change it there either.
Hello, @rFlex
Thanks for the great source; it works like a charm.
However, I have a little problem: how do I remove a recorded section, like Instagram does? I saw that your NSArray+SCAdditions is used for this function?
Hi. First of all, Thanks for @rFlex for providing us this awesome framework! I love to use this framework and modify it.
I was working on my project and came up with an idea to improve this project: I think we need a more sophisticated loading view. I use a horizontal collection view to play multiple videos with SCPlayer and SCVideoPlayerView in my project. The videos are loaded from a server and cached before they are played, but while the videos are downloading, the loading view is not very responsive or beautiful.
I thought it would be helpful community-wide if we could add a download-progress feedback feature that uses the first frame of the video as the loading view and animates the view's alpha value, like Mindie does.
Thanks!
Hey guys,
I just pushed the brand new implementation. I renamed the project to SCRecorder. A lot of things changed (actually, pretty much everything changed). You will have to learn the library from scratch again, sorry about that :(.
Now let's talk about the good things!
If you have any questions, feel free to ask in this topic!
Cheers,
Simon
Hello :)
First of all, thanks for the nice library. I'm using these sources in my project, and you saved me time!
I'm currently using the sources by importing them directly from the zip file I downloaded from GitHub.
I want to use the sources via CocoaPods, but when I update my Podfile and look at the source files in the pod project, several sources are missing.
There are no filter-related sources. I think the podspec is missing them.
Please check this issue. Thanks.
Receive a call on the device (e.g. through Skype). Now open the application (SCRecorderExamples) and begin recording. It seems like the first segment is recorded successfully, but it isn't, and you cannot record again. Navigating to the preview, no video is shown. The delegate methods recorder:didAppendVideoSampleBuffer: and recorder:didAppendAudioSampleBuffer: are not getting called.
If a Skype call arrives while recording, we can continue recording, but there will not be any sound in the video when playing it in the preview view.
While putting breakpoints here and there, I got these two errors:
ERROR: [0x103534000] AVAudioSessionPortImpl.mm:50: ValidateRequiredFields: Unknown selected data source for Port iPhone Microphone (type: MicrophoneBuiltIn)
End record segment -1, error : Error Domain=AVFoundationErrorDomain Code=-11821 "Cannot Decode" UserInfo=0x178271240 {NSUnderlyingError=0x17824b1f0 "The operation couldn’t be completed. (OSStatus error 560226676.)", NSLocalizedFailureReason=The media data could not be decoded. It may be damaged., NSLocalizedDescription=Cannot Decode}
The problem is on iOS 7. When you set the audioEnabled property of SCRecorder to NO, it works perfectly but without sound.
Please look into this issue.
Hello,
THANKS for all the work :)
[[NSFileManager defaultManager] fileExistsAtPath:[_recordSession.outputUrl path]] is returning NO in "finishSession".
Is that on purpose?
Thanks in advance.
Moved it to pull request
I've been trying many different ways to get a video out after the recording session. First, I'm having problems actually stopping the session: calling endRunningSession does nothing, and even other methods don't stop the session. Basically, didCompleteRecordSession never gets called.
//
// RecordChallengeViewController.m
// Sportsy Beta
//
// Created by Pirate Andy on 8/25/14.
// Copyright (c) 2014 Sportsy. All rights reserved.
//
#import "RecordChallengeViewController.h"
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>
#import "SCAudioTools.h"
#import "SCRecorderFocusView.h"
#import "SCRecorder.h"
#import "SCRecordSessionManager.h"
#import "SCTouchDetector.h"
#import "SCAssetExportSession.h"
#import "GridView.h"
#define kVideoPreset AVCaptureSessionPresetHigh
@interface RecordChallengeViewController () {
    SCRecorder *_recorder;
    UIImage *_photo;
    SCRecordSession *_recordSession;
}

@property (strong, nonatomic) SCRecorderFocusView *focusView;

@end

@implementation RecordChallengeViewController

- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
{
    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
    if (self) {
        // Custom initialization
    }
    return self;
}
- (void)viewDidLoad
{
    [super viewDidLoad];
    _secondDelay = 0;
    _recorder = [SCRecorder recorder];
    _recorder.sessionPreset = AVCaptureSessionPreset1280x720;
    _recorder.audioEnabled = YES;
    _recorder.delegate = self;
    _recorder.autoSetVideoOrientation = YES;

    UIView *previewView = self.previewView;
    _recorder.previewView = previewView;

    [self.retakeButton addTarget:self action:@selector(handleRetakeButtonTapped:) forControlEvents:UIControlEventTouchUpInside];
    [self.reverseCamera addTarget:self action:@selector(handleReverseCameraTapped:) forControlEvents:UIControlEventTouchUpInside];
    //[self.previewView addGestureRecognizer:[[SCTouchDetector alloc] initWithTarget:self action:@selector(handleTouchDetected:)]];

    self.focusView = [[SCRecorderFocusView alloc] initWithFrame:previewView.bounds];
    self.focusView.recorder = _recorder;
    [previewView addSubview:self.focusView];
    self.focusView.outsideFocusTargetImage = [UIImage imageNamed:@"record-focus"];
    self.focusView.insideFocusTargetImage = [UIImage imageNamed:@"record-focus"];

    [_recorder openSession:^(NSError *sessionError, NSError *audioError, NSError *videoError, NSError *photoError) {
        NSLog(@"==== Opened session ====");
        NSLog(@"Session error: %@", sessionError.description);
        NSLog(@"Audio error : %@", audioError.description);
        NSLog(@"Video error: %@", videoError.description);
        NSLog(@"Photo error: %@", photoError.description);
        NSLog(@"=======================");
        [self prepareCamera];
    }];

    GridView *gView = [[GridView alloc] initWithFrame:self.view.frame];
    gView.backgroundColor = [UIColor clearColor];
    [self.gridContainerView addSubview:gView];
    // Do any additional setup after loading the view from its nib.
}
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [_recorder startRunningSession];
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    [_recorder endRunningSession];
}

- (void)prepareCamera {
    if (_recorder.recordSession == nil) {
        SCRecordSession *session = [SCRecordSession recordSession];
        session.suggestedMaxRecordDuration = CMTimeMakeWithSeconds(5, 10000);
        _recorder.recordSession = session;
    }
}

- (void)showError:(NSError *)error {
    [[[UIAlertView alloc] initWithTitle:@"Something went wrong" message:error.localizedDescription delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil] show];
}
- (void)recorder:(SCRecorder *)recorder didCompleteRecordSession:(SCRecordSession *)recordSession {
    void (^completionHandler)(NSURL *video, NSError *error) = ^(NSURL *video, NSError *error) {
        [[UIApplication sharedApplication] endIgnoringInteractionEvents];
        if (error == nil) {
            [[[UIAlertView alloc] initWithTitle:@"Saved to camera roll" message:@"" delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil] show];
        } else {
            [[[UIAlertView alloc] initWithTitle:@"Failed to save" message:error.localizedDescription delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil] show];
        }
    };

    NSURL *video = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingString:@"_output.mp4"]];
    SCAssetExportSession *exportSession = [[SCAssetExportSession alloc] initWithAsset:[_recordSession assetRepresentingRecordSegments]];
    exportSession.sessionPreset = SCAssetExportSessionPresetHighestQuality;
    exportSession.outputUrl = video;
    exportSession.outputFileType = AVFileTypeMPEG4;
    exportSession.keepVideoSize = YES;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            if (completionHandler != nil) {
                completionHandler(video, exportSession.error);
            }
        });
    }];
    DLog(@"Recorder did complete record session");
}
- (void)recorder:(SCRecorder *)recorder didInitializeAudioInRecordSession:(SCRecordSession *)recordSession error:(NSError *)error {
    if (error == nil) {
        NSLog(@"Initialized audio in record session");
    } else {
        NSLog(@"Failed to initialize audio in record session: %@", error.localizedDescription);
    }
}

- (void)recorder:(SCRecorder *)recorder didInitializeVideoInRecordSession:(SCRecordSession *)recordSession error:(NSError *)error {
    if (error == nil) {
        NSLog(@"Initialized video in record session");
    } else {
        NSLog(@"Failed to initialize video in record session: %@", error.localizedDescription);
    }
}

- (void)recorder:(SCRecorder *)recorder didBeginRecordSegment:(SCRecordSession *)recordSession error:(NSError *)error {
    NSLog(@"Began record segment: %@", error);
}

- (void)recorder:(SCRecorder *)recorder didEndRecordSegment:(SCRecordSession *)recordSession segmentIndex:(NSInteger)segmentIndex error:(NSError *)error {
    DLog(@"Ended record segment: %@", error);
}

#pragma mark - Focus

- (void)recorderDidStartFocus:(SCRecorder *)recorder {
    [self.focusView showFocusAnimation];
}

- (void)recorderDidEndFocus:(SCRecorder *)recorder {
    [self.focusView hideFocusAnimation];
}

- (void)recorderWillStartFocus:(SCRecorder *)recorder {
    [self.focusView showFocusAnimation];
}
#pragma mark - Recorder Actions

- (void)handleStopButtonTapped:(id)sender {
    SCRecordSession *recordSession = _recorder.recordSession;
    if (recordSession != nil) {
        [self finishSession:recordSession];
    }
}

- (void)finishSession:(SCRecordSession *)recordSession {
    [recordSession endRecordSegment:^(NSInteger segmentIndex, NSError *error) {
        [[SCRecordSessionManager sharedInstance] saveRecordSession:recordSession];
        _recordSession = recordSession;
        [self prepareCamera];
    }];
}

- (IBAction)delayClick:(id)sender {
    CGPoint currentPositionDescription = self.descriptionScrollView.contentOffset;
    if (currentPositionDescription.y >= self.descriptionView.frame.origin.y) {
        [_descriptionBtn setImage:[UIImage imageNamed:@"record-notes"] forState:UIControlStateNormal];
        [self.descriptionScrollView setContentOffset:CGPointMake(self.descriptionView.frame.origin.x, currentPositionDescription.y - self.descriptionView.frame.origin.y) animated:YES];
    }
    CGPoint currentPosition = self.delayScrollView.contentOffset;
    if (currentPosition.y >= self.delayView.frame.origin.y) {
        [_delayBtn setImage:[UIImage imageNamed:@"record-delay"] forState:UIControlStateNormal];
        [self.delayScrollView setContentOffset:CGPointMake(self.delayView.frame.origin.x, currentPosition.y - self.delayView.frame.origin.y) animated:YES];
    } else {
        [_delayBtn setImage:[UIImage imageNamed:@"record-delay-on"] forState:UIControlStateNormal];
        [self.delayScrollView setContentOffset:CGPointMake(self.delayView.frame.origin.x, self.delayView.frame.origin.y) animated:YES];
    }
    if (_secondDelay > 0) {
        [_delayBtn setImage:[UIImage imageNamed:@"record-delay-on"] forState:UIControlStateNormal];
        _secondLabel.text = [NSString stringWithFormat:@"%i secs", _secondDelay];
    } else {
        _secondLabel.text = @"";
    }
}
- (IBAction)micClick:(id)sender {
    if (_recorder.audioEnabled) {
        [_micBtn setImage:[UIImage imageNamed:@"record-mic"] forState:UIControlStateNormal];
        _recorder.audioEnabled = NO;
    } else {
        [_micBtn setImage:[UIImage imageNamed:@"record-mic-on"] forState:UIControlStateNormal];
        _recorder.audioEnabled = YES;
    }
    [self prepareCamera];
}

- (IBAction)gridClick:(id)sender {
    if (self.gridContainerView.hidden) {
        [_gridBtn setImage:[UIImage imageNamed:@"record-grid-on"] forState:UIControlStateNormal];
        self.gridContainerView.hidden = NO;
    } else {
        [_gridBtn setImage:[UIImage imageNamed:@"record-grid"] forState:UIControlStateNormal];
        self.gridContainerView.hidden = YES;
    }
}

- (IBAction)descriptionClick:(id)sender {
    CGPoint currentPositionDelay = self.delayScrollView.contentOffset;
    if (currentPositionDelay.y >= self.delayView.frame.origin.y) {
        [_delayBtn setImage:[UIImage imageNamed:@"record-delay"] forState:UIControlStateNormal];
        [self.delayScrollView setContentOffset:CGPointMake(self.delayView.frame.origin.x, currentPositionDelay.y - self.delayView.frame.origin.y) animated:YES];
    }
    CGPoint currentPosition = self.descriptionScrollView.contentOffset;
    if (currentPosition.y >= self.descriptionView.frame.origin.y) {
        [_descriptionBtn setImage:[UIImage imageNamed:@"record-notes"] forState:UIControlStateNormal];
        [self.descriptionScrollView setContentOffset:CGPointMake(self.descriptionView.frame.origin.x, currentPosition.y - self.descriptionView.frame.origin.y) animated:YES];
    } else {
        [_descriptionBtn setImage:[UIImage imageNamed:@"record-notes-on"] forState:UIControlStateNormal];
        [self.descriptionScrollView setContentOffset:CGPointMake(self.descriptionView.frame.origin.x, self.descriptionView.frame.origin.y) animated:YES];
    }
}
- (IBAction)delayItemClick:(id)sender {
    UIButton *instanceButton = (UIButton *)sender;
    int tag = instanceButton.tag;
    _secondDelay = tag;
    if (_secondDelay > 0) {
        [_delayBtn setImage:[UIImage imageNamed:@"record-delay-on"] forState:UIControlStateNormal];
        _secondLabel.text = [NSString stringWithFormat:@"%i secs", _secondDelay];
    } else {
        _secondLabel.text = @"";
    }
    // Deselect all delay buttons, then select the tapped one
    [_fifteenSecondBtn setSelected:NO];
    [_tenSecondBtn setSelected:NO];
    [_fiveSecondBtn setSelected:NO];
    [_noDelayBtn setSelected:NO];
    [instanceButton setSelected:YES];
}

- (void)handleRetakeButtonTapped:(id)sender {
    SCRecordSession *recordSession = _recorder.recordSession;
    if (recordSession != nil) {
        _recorder.recordSession = nil;
        // If the recordSession was saved, we don't want to completely destroy it
        if ([[SCRecordSessionManager sharedInstance] isSaved:recordSession]) {
            [recordSession endRecordSegment:nil];
        } else {
            [recordSession cancelSession:nil];
        }
    }
    [self prepareCamera];
}

- (void)handleReverseCameraTapped:(id)sender {
    [_recorder switchCaptureDevices];
}

- (IBAction)recordPauseClick:(id)sender {
    if ([_recorder isRecording]) {
        [[[UIAlertView alloc] initWithTitle:@"Failed to save" message:@"VIDEO IS RECORDING. ATTEMPTING TO SAVE" delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil] show];
        [_recordBtn setImage:[UIImage imageNamed:@"record-btn"] forState:UIControlStateNormal];
        SCRecordSession *recordSession = _recorder.recordSession;
        [recordSession endRecordSegment:nil];
        if (recordSession != nil) {
            [self finishSession:recordSession];
        }
    } else {
        [_recorder record];
        [_recordBtn setImage:[UIImage imageNamed:@"record-btn-stop"] forState:UIControlStateNormal];
    }
}

/*
- (void)handleTouchDetected:(SCTouchDetector *)touchDetector {
    if (touchDetector.state == UIGestureRecognizerStateBegan) {
        [_recorder record];
    } else if (touchDetector.state == UIGestureRecognizerStateEnded) {
        [_recorder pause];
    }
}
*/

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

@end
This is where the error gets thrown.
_reader = [AVAssetReader assetReaderWithAsset:self.inputAsset error:&error];
- (void)exportAsynchronouslyWithCompletionHandler:(void (^)())completionHandler {
    NSError *error = nil;
    [[NSFileManager defaultManager] removeItemAtURL:self.outputUrl error:nil];
    _writer = [AVAssetWriter assetWriterWithURL:self.outputUrl fileType:self.outputFileType error:&error];
    EnsureSuccess(error, completionHandler);
    _reader = [AVAssetReader assetReaderWithAsset:self.inputAsset error:&error];
    EnsureSuccess(error, completionHandler);
Please add rotation support.
There doesn't seem to be any pattern to how you access the capture session on your _dispatchQueue.
For example, in openSession: you do this on the current queue:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
_beginSessionConfigurationCount = 0;
_captureSession = session;
But then later on you do:
if (!_captureSession.isRunning) {
    dispatch_async(_dispatchQueue, ^{
        [_captureSession startRunning];
    });
}
Why are you accessing the capture session from different queues? At first I ignored it, thinking it was harmless, but I noticed that many times when I tried using the recorder, the entire iOS media server would crash and this notification would be posted:
AVAudioSessionMediaServicesWereResetNotification
I forked your repo and moved all access to the capture session onto its respective queue, and I no longer see this issue.
On a separate note, you don't seem to be handling any of the error notifications, such as:
AVCaptureSessionRuntimeErrorNotification
AVCaptureSessionWasInterruptedNotification
AVAudioSessionMediaServicesWereResetNotification
AVAudioSessionMediaServicesWereLostNotification
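Observing those notifications is standard NSNotificationCenter usage (nothing SCRecorder-specific; the selector names here are placeholders):

```objc
// Register for capture-session and audio-session failure notifications so
// the app can tear down and rebuild the session instead of silently failing.
NSNotificationCenter *center = [NSNotificationCenter defaultCenter];
[center addObserver:self
           selector:@selector(sessionRuntimeError:)
               name:AVCaptureSessionRuntimeErrorNotification
             object:nil];
[center addObserver:self
           selector:@selector(sessionWasInterrupted:)
               name:AVCaptureSessionWasInterruptedNotification
             object:nil];
[center addObserver:self
           selector:@selector(mediaServicesWereReset:)
               name:AVAudioSessionMediaServicesWereResetNotification
             object:nil];
```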
Great library!
But is there a way to change the 1080x1920 to something smaller, like Vine's square videos, which are not very big and great for uploading to a server because of the file size?
Also, I have a question regarding the temp directory: when is the /tmp directory flushed? I see all videos stored there since the very beginning.
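On the /tmp question: iOS only purges the temporary directory opportunistically, so apps typically delete their own exported files. A plain Foundation sketch (nothing SCRecorder-specific; the `.mp4` filter is an assumption about which files you want gone):

```objc
// Remove leftover exported videos from the app's temporary directory.
NSString *tmpDir = NSTemporaryDirectory();
NSFileManager *fm = [NSFileManager defaultManager];
for (NSString *file in [fm contentsOfDirectoryAtPath:tmpDir error:nil]) {
    if ([file.pathExtension isEqualToString:@"mp4"]) {
        [fm removeItemAtPath:[tmpDir stringByAppendingPathComponent:file]
                       error:nil];
    }
}
```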
Hi rFlex,
I use SCRecorder and it works well, but I want to save the video file to the camera roll after recording, and I can't.
Please help me!
Thanks!
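One way to do this with the AssetsLibrary framework (which the sample code in this thread already imports); `exportedVideoURL` is a placeholder for the file URL produced by your SCAssetExportSession:

```objc
// Write the exported video file to the user's camera roll.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeVideoAtPathToSavedPhotosAlbum:exportedVideoURL
                            completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error != nil) {
        NSLog(@"Failed to save to camera roll: %@", error.localizedDescription);
    } else {
        NSLog(@"Saved to camera roll: %@", assetURL);
    }
}];
```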
Sorry, I want to ask: is it possible to save the video file in an 'mp4' container?
Hi rFlex,
I want to resume capturing video. I followed your guide:
// Get a dictionary representation of the record session
// and save it somewhere, so you can use it later!
NSDictionary *dictionaryRepresentation = [recordSession dictionaryRepresentation];
[[NSUserDefaults standardUserDefaults] setObject:dictionaryRepresentation forKey:@"RecordSession"];

// Restore a record session from a saved dictionary representation
NSDictionary *dictionaryRepresentation = [[NSUserDefaults standardUserDefaults] objectForKey:@"RecordSession"];
SCRecordSession *recordSession = [SCRecordSession recordSession:dictionaryRepresentation];

but I can't resume capturing video.
Do you have a different way?
Thanks!
Hi,
I tried your iOS example for test purposes. It seems to work great, but it doesn't save the video to the camera roll. It asks for permission the first time and says that the video has been successfully saved, but there is nothing there.
Hi There,
The library is amazing; however, it would be even better if you could implement a tap to view the current frame, e.g. while viewing a Vine: if you tap on it, it pauses itself and resumes when you tap it again.
Attached is an image of what's happening.
When I rotate the phone, the rest of the views in the XIB rotate and scale properly, but the preview layer basically stays the same. When recording, the video does come out properly, it's just the preview layer.
#import "RecordChallengeViewController.h"
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>
#import "SCAudioTools.h"
#import "SCRecorderFocusView.h"
#import "SCRecorder.h"
#import "SCRecordSessionManager.h"
#import "SCTouchDetector.h"
#import "SCAssetExportSession.h"
#import "GridView.h"
#import "SBJson.h"
#define kVideoPreset AVCaptureSessionPresetHigh
@interface RecordChallengeViewController () {
    SCRecorder *_recorder;
    UIImage *_photo;
    SCRecordSession *_recordSession;
}

@property (strong, nonatomic) SCRecorderFocusView *focusView;

@end

@implementation RecordChallengeViewController

- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
{
    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
    if (self) {
        // Custom initialization
    }
    return self;
}
- (void)viewDidLoad
{
    [super viewDidLoad];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(didRotate:) name:UIDeviceOrientationDidChangeNotification object:nil];
    _secondDelay = 0;
    aws = [[SportsyAWS alloc] init];
    model = [SportsyModel sharedModel];
    analytics = [SportsyAnalytics sharedModel];

    _recorder = [SCRecorder recorder];
    _recorder.sessionPreset = kVideoPreset;
    _recorder.audioEnabled = YES;
    _recorder.delegate = self;
    _recorder.autoSetVideoOrientation = YES;

    UIView *previewView = self.previewView;
    previewView.backgroundColor = [UIColor redColor];
    _recorder.previewView = previewView;

    UITapGestureRecognizer *tapRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(closeOpenSlides:)];
    tapRecognizer.numberOfTapsRequired = 1;
    [self.view addGestureRecognizer:tapRecognizer];

    [self.retakeButton addTarget:self action:@selector(handleRetakeButtonTapped:) forControlEvents:UIControlEventTouchUpInside];
    [self.reverseCamera addTarget:self action:@selector(handleReverseCameraTapped:) forControlEvents:UIControlEventTouchUpInside];

    self.focusView = [[SCRecorderFocusView alloc] initWithFrame:previewView.bounds];
    self.focusView.recorder = _recorder;
    [previewView addSubview:self.focusView];
    self.focusView.outsideFocusTargetImage = [UIImage imageNamed:@"record-focus"];
    self.focusView.insideFocusTargetImage = [UIImage imageNamed:@"record-focus"];

    [_recorder openSession:^(NSError *sessionError, NSError *audioError, NSError *videoError, NSError *photoError) {
        NSLog(@"==== Opened session ====");
        NSLog(@"Session error: %@", sessionError.description);
        NSLog(@"Audio error : %@", audioError.description);
        NSLog(@"Video error: %@", videoError.description);
        NSLog(@"Photo error: %@", photoError.description);
        NSLog(@"=======================");
        [self prepareCamera];
    }];

    GridView *gView = [[GridView alloc] initWithFrame:self.view.frame];
    gView.backgroundColor = [UIColor clearColor];
    [self.gridContainerView addSubview:gView];
    // Do any additional setup after loading the view from its nib.
}
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [_recorder startRunningSession];
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    [_recorder endRunningSession];
}

- (void)prepareCamera {
    if (_recorder.recordSession == nil) {
        SCRecordSession *session = [SCRecordSession recordSession];
        //session.suggestedMaxRecordDuration = CMTimeMakeWithSeconds(5, 10000);
        _recorder.recordSession = session;
    }
}

- (void)showError:(NSError *)error {
    [[[UIAlertView alloc] initWithTitle:@"Something went wrong" message:error.localizedDescription delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil] show];
}
- (void)recorder:(SCRecorder *)recorder didCompleteRecordSession:(SCRecordSession *)recordSession {
    [self finishSession:recordSession];
}

- (void)recorder:(SCRecorder *)recorder didInitializeAudioInRecordSession:(SCRecordSession *)recordSession error:(NSError *)error {
    if (error == nil) {
        NSLog(@"Initialized audio in record session");
    } else {
        NSLog(@"Failed to initialize audio in record session: %@", error.localizedDescription);
    }
}

- (void)recorder:(SCRecorder *)recorder didInitializeVideoInRecordSession:(SCRecordSession *)recordSession error:(NSError *)error {
    if (error == nil) {
        NSLog(@"Initialized video in record session");
    } else {
        NSLog(@"Failed to initialize video in record session: %@", error.localizedDescription);
    }
}

- (void)recorder:(SCRecorder *)recorder didBeginRecordSegment:(SCRecordSession *)recordSession error:(NSError *)error {
    NSLog(@"Began record segment: %@", error);
}

- (void)recorder:(SCRecorder *)recorder didEndRecordSegment:(SCRecordSession *)recordSession segmentIndex:(NSInteger)segmentIndex error:(NSError *)error {
    DLog(@"Ended record segment: %@", error);
}

#pragma mark - Focus

- (void)recorderDidStartFocus:(SCRecorder *)recorder {
    [self.focusView showFocusAnimation];
}

- (void)recorderDidEndFocus:(SCRecorder *)recorder {
    [self.focusView hideFocusAnimation];
}

- (void)recorderWillStartFocus:(SCRecorder *)recorder {
    [self.focusView showFocusAnimation];
}
#pragma mark - Recorder Actions

- (void)handleStopButtonTapped:(id)sender {
    SCRecordSession *recordSession = _recorder.recordSession;
    if (recordSession != nil) {
        [self finishSession:recordSession];
    }
}

- (void)finishSession:(SCRecordSession *)recordSession {
    [recordSession endRecordSegment:^(NSInteger segmentIndex, NSError *error) {
        [[SCRecordSessionManager sharedInstance] saveRecordSession:recordSession];
        _recordSession = recordSession;
        [self prepareCamera];
    }];
}

- (IBAction)delayClick:(id)sender {
    CGPoint currentPositionDescription = self.descriptionScrollView.contentOffset;
    if (currentPositionDescription.y >= self.descriptionView.frame.origin.y) {
        [_descriptionBtn setImage:[UIImage imageNamed:@"record-notes"] forState:UIControlStateNormal];
        [self.descriptionScrollView setContentOffset:CGPointMake(self.descriptionView.frame.origin.x, currentPositionDescription.y - self.descriptionView.frame.origin.y) animated:YES];
    }
    CGPoint currentPosition = self.delayScrollView.contentOffset;
    if (currentPosition.y >= self.delayView.frame.origin.y) {
        [_delayBtn setImage:[UIImage imageNamed:@"record-delay"] forState:UIControlStateNormal];
        [self.delayScrollView setContentOffset:CGPointMake(self.delayView.frame.origin.x, currentPosition.y - self.delayView.frame.origin.y) animated:YES];
    } else {
        [_delayBtn setImage:[UIImage imageNamed:@"record-delay-on"] forState:UIControlStateNormal];
        [self.delayScrollView setContentOffset:CGPointMake(self.delayView.frame.origin.x, self.delayView.frame.origin.y) animated:YES];
    }
    if (_secondDelay > 0) {
        [_delayBtn setImage:[UIImage imageNamed:@"record-delay-on"] forState:UIControlStateNormal];
        _secondLabel.text = [NSString stringWithFormat:@"%i secs", _secondDelay];
    } else {
        _secondLabel.text = @"";
    }
}
-(IBAction)micClick:(id)sender {
if(_recorder.audioEnabled) {
[_micBtn setImage:[UIImage imageNamed:@"record-mic"] forState:UIControlStateNormal];
_recorder.audioEnabled = NO;
} else {
[_micBtn setImage:[UIImage imageNamed:@"record-mic-on"] forState:UIControlStateNormal];
_recorder.audioEnabled = YES;
}
[self prepareCamera];
}
-(IBAction)gridClick:(id)sender {
if(self.gridContainerView.hidden) {
[_gridBtn setImage:[UIImage imageNamed:@"record-grid-on"] forState:UIControlStateNormal];
self.gridContainerView.hidden = NO;
} else {
[_gridBtn setImage:[UIImage imageNamed:@"record-grid"] forState:UIControlStateNormal];
self.gridContainerView.hidden = YES;
}
}
-(IBAction)descriptionClick:(id)sender {
CGPoint currentPositionDelay = self.delayScrollView.contentOffset;
if(currentPositionDelay.y >= self.delayView.frame.origin.y) {
[_delayBtn setImage:[UIImage imageNamed:@"record-delay"] forState:UIControlStateNormal];
[self.delayScrollView setContentOffset:CGPointMake(self.delayView.frame.origin.x, currentPositionDelay.y - self.delayView.frame.origin.y) animated:YES];
}
CGPoint currentPosition = self.descriptionScrollView.contentOffset;
if(currentPosition.y >= self.descriptionView.frame.origin.y) {
[_descriptionBtn setImage:[UIImage imageNamed:@"record-notes"] forState:UIControlStateNormal];
[self.descriptionScrollView setContentOffset:CGPointMake(self.descriptionView.frame.origin.x, currentPosition.y - self.descriptionView.frame.origin.y) animated:YES];
} else {
[_descriptionBtn setImage:[UIImage imageNamed:@"record-notes-on"] forState:UIControlStateNormal];
[self.descriptionScrollView setContentOffset:CGPointMake(self.descriptionView.frame.origin.x, self.descriptionView.frame.origin.y) animated:YES];
}
}
-(IBAction)delayItemClick:(id)sender {
UIButton *instanceButton = (UIButton *)sender;
NSInteger tag = instanceButton.tag; // tag is NSInteger, not int
_secondDelay = (int)tag;
if(_secondDelay > 0) {
[_delayBtn setImage:[UIImage imageNamed:@"record-delay-on"] forState:UIControlStateNormal];
_secondLabel.text = [NSString stringWithFormat:@"%i secs", _secondDelay];
} else {
_secondLabel.text = @"";
}
//Disable all selected buttons
[_fifteenSecondBtn setSelected:NO];
[_tenSecondBtn setSelected:NO];
[_fiveSecondBtn setSelected:NO];
[_noDelayBtn setSelected:NO];
[instanceButton setSelected:YES];
}
- (void) handleRetakeButtonTapped:(id)sender {
SCRecordSession *recordSession = _recorder.recordSession;
if (recordSession != nil) {
_recorder.recordSession = nil;
// If the recordSession was saved, we don't want to completely destroy it
if ([[SCRecordSessionManager sharedInstance] isSaved:recordSession]) {
[recordSession endRecordSegment:nil];
} else {
[recordSession cancelSession:nil];
}
}
[self prepareCamera];
}
- (void) handleReverseCameraTapped:(id)sender {
[_recorder switchCaptureDevices];
}
-(IBAction)recordPauseClick:(id)sender {
if([_recorder isRecording]) {
[_recordBtn setImage:[UIImage imageNamed:@"record-btn"] forState:UIControlStateNormal];
DLog(@"TRYING TO END SESSION");
SCRecordSession *recordSession = _recorder.recordSession;
_recorder.recordSession = nil;
[recordSession endSession:^(NSError *error) {
DLog(@"END SESSION CALL");
if (error == nil) {
NSURL *fileUrl = recordSession.outputUrl;
NSString *videoFileName = [NSString stringWithFormat:@"%@-%@.mp4", [model uid], [model getMongoIdFromDictionary:_challengeDict]];
[aws uploadWithBackgroundThread:fileUrl withFileName:videoFileName];
[self dismissViewControllerAnimated:YES completion:nil];
} else {
DLog(@"%@", error);
}
}];
} else {
if(_secondDelay > 0) {
_delayTimerLabel.text = [NSString stringWithFormat:@"%i", _secondDelay];
_delayTimerView.hidden = NO;
_timer = [NSTimer scheduledTimerWithTimeInterval:1 target:self selector:@selector(timerFired) userInfo:nil repeats:YES];
} else {
[_recorder record];
}
[_recordBtn setImage:[UIImage imageNamed:@"record-btn-stop"] forState:UIControlStateNormal];
}
}
-(void)timerFired {
if(_secondDelay > 0){
_secondDelay-=1;
if(_secondDelay>-1) {
_delayTimerLabel.text = [NSString stringWithFormat:@"%i", _secondDelay];
}
} else {
_delayTimerView.hidden = YES;
[_recorder record];
[_timer invalidate];
}
}
-(void)closeOpenSlides:(UITapGestureRecognizer *)recognizer {
CGPoint currentPositionDelay = self.delayScrollView.contentOffset;
if(currentPositionDelay.y >= self.delayView.frame.origin.y) {
[_delayBtn setImage:[UIImage imageNamed:@"record-delay"] forState:UIControlStateNormal];
[self.delayScrollView setContentOffset:CGPointMake(self.delayView.frame.origin.x, currentPositionDelay.y - self.delayView.frame.origin.y) animated:YES];
}
CGPoint currentPositionDescription = self.descriptionScrollView.contentOffset;
if(currentPositionDescription.y >= self.descriptionView.frame.origin.y) {
[_descriptionBtn setImage:[UIImage imageNamed:@"record-notes"] forState:UIControlStateNormal];
[self.descriptionScrollView setContentOffset:CGPointMake(self.descriptionView.frame.origin.x, currentPositionDescription.y - self.descriptionView.frame.origin.y) animated:YES];
}
}
/*
- (void)handleTouchDetected:(SCTouchDetector*)touchDetector {
if (touchDetector.state == UIGestureRecognizerStateBegan) {
[_recorder record];
} else if (touchDetector.state == UIGestureRecognizerStateEnded) {
[_recorder pause];
}
}
*/
- (void) didRotate:(NSNotification *)notification
{
UIDeviceOrientation orientation = [[UIDevice currentDevice] orientation];
if (orientation == UIDeviceOrientationLandscapeLeft || orientation == UIDeviceOrientationLandscapeRight)
{
_recorder.previewView.frame = CGRectMake(0, 0, [[UIScreen mainScreen] bounds].size.height, [[UIScreen mainScreen] bounds].size.width);
} else {
_recorder.previewView.frame = CGRectMake(0, 0, [[UIScreen mainScreen] bounds].size.width, [[UIScreen mainScreen] bounds].size.height);
}
}
- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration {
//[self prepareCamera];
}
- (BOOL)automaticallyForwardAppearanceAndRotationMethodsToChildViewControllers {
return YES;
}
- (BOOL)shouldAutomaticallyForwardRotationMethods {
return YES;
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
@end
For some reason the record session restarts after taking a few clips:
"2014-07-25 19:48:42.315[1210:327713] Initialized audio in record session
2014-07-25 19:48:42.322[1210:327713] Initialized video in record session
2014-07-25 19:48:42.322[1210:327713] Began record segment: (null)
2014-07-25 19:48:42.324[1210:327713] End record segment -1: (null)
2014-07-25 19:48:54.290[1210:327713] Began record segment: (null)
2014-07-25 19:49:07.480[1210:327713] End record segment -1: (null)"
Any ideas?
If there isn't already a method to do so, it would be nice to add one. Thanks!
Hi,
I followed your guide and removed a record segment like this:
[recordSession removeSegmentAtIndex:1 deleteFile:YES];
or
[recordSession removeSegmentAtIndex:1 deleteFile:NO];
In both cases I get an error.
I'm looking forward to your help!
Thanks and best regards!
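For reference, a minimal sketch of how segment removal is typically driven, using only the calls quoted elsewhere on this page (removeSegmentAtIndex:deleteFile: and currentRecordDuration); exact signatures may differ between SCRecorder versions:

```objc
// Remove the second segment and delete its backing file on disk.
[recordSession removeSegmentAtIndex:1 deleteFile:YES];

// Refresh any duration UI afterwards; currentRecordDuration should be
// recomputed from the remaining segments.
CMTime currentTime = recordSession.currentRecordDuration;
self.timeRecordedLabel.text = [NSString stringWithFormat:@"%.2f s",
                               CMTimeGetSeconds(currentTime)];
```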
When trying to set the video orientation like so:
self.camera.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
or
[self.camera setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
No matter what, it always records as a portrait.
See code below based upon your example:
- (void)viewDidAppear:(BOOL)animated {
[super viewDidAppear:animated];
[self.camera openSession:^(NSError * audioError, NSError * videoError) {
[self prepareCamera];
double delayInSeconds = 3.0;
dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, delayInSeconds * NSEC_PER_SEC);
dispatch_after(popTime, dispatch_get_main_queue(), ^(void){
[self.camera startRunningSession];
});
}];
}
- (void)viewDidLoad
{
[super viewDidLoad];
counter = 20;
isRecording = NO;
self.navigationController.navigationBar.hidden = YES;
self.camera = [[SCCamera alloc] initWithSessionPreset:AVCaptureSessionPresetHigh];
self.camera.delegate = self;
self.camera.enableSound = YES;
self.camera.previewVideoGravity = SCVideoGravityResizeAspectFill;
UIView *previewView = cameraContainer;
self.camera.previewView = previewView;
self.camera.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
self.camera.recordingDurationLimit = CMTimeMakeWithSeconds(counter, 1);
self.focusView = [[SCCameraFocusView alloc] initWithFrame:previewView.bounds];
self.focusView.camera = self.camera;
[previewView addSubview:self.focusView];
self.focusView.outsideFocusTargetImage = [UIImage imageNamed:@"camera-icon"];
self.focusView.insideFocusTargetImage = [UIImage imageNamed:@"camera-icon"];
}
- (void) prepareCamera {
if (![self.camera isPrepared]) {
NSError *error = nil; // initialize, since the out-error is only set on failure
[self.camera prepareRecordingOnTempDir:&error];
if (error != nil) {
DLog(@"%@", error);
} else {
[self.camera record];
NSTimer *countdownTimer = [NSTimer scheduledTimerWithTimeInterval:1
target:self
selector:@selector(advanceTimer:)
userInfo:nil
repeats:YES];
isRecording = YES;
}
}
}
The app crashes when I tap the stop button. How can I resolve it?
The crash log is:
Unknown class SCVideoPlayerView in Interface Builder file.
2013-12-20 19:01:47.469 SCAudioVideoRecorderExample[6685:907] -[UIView player]: unrecognized selector sent to instance 0x208944f0
2013-12-20 19:01:47.472 SCAudioVideoRecorderExample[6685:907] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[UIView player]: unrecognized selector sent to instance 0x208944f0'
*** First throw call stack:
(0x342ef2a3 0x3bfd397f 0x342f2e07 0x342f1531 0x34248f68 0x112d35 0x36116595 0x3616b14b 0x3616b091 0x3616af75 0x3616ae99 0x3616a5d9 0x3616a4c1 0x36158b93 0x36158833 0x10d791 0x10ce7d 0x11d369 0x3c3eb11f 0x3c3ea4b7 0x3c3ef1bd 0x342c2f3b 0x34235ebd 0x34235d49 0x37dfd2eb 0x3614b301 0x10ae5d 0x3c40ab20)
libc++abi.dylib: terminate called throwing an exception
How can I resolve this?
Thanks in advance.
When merging record segments, or even just getting the assetRepresentation, some segments seem to be missing. The recordSegments.count is correct and the end-record-segment log looks correct, but the video/asset output is wrong: if the count is 4 segments, the final video shows only 3 of them, leaving out an appended segment. I notice this happening when I delete and then append segments. Any ideas? A sample project showing the correct way to implement delete and append would help, as I may be doing something wrong.
I'm unable to pinpoint the exact location of the problem or to reproduce the issue consistently. It seems almost random.
I have noticed that on my iPhone 4S there is a problem with switching the camera in between calls to pause and record. It does not matter which camera device you start with - when switching to the other camera and resuming record no video will be captured to the session. I have tried this with my own project as well as the sample project you provided and see the same issue.
This behavior works as expected in testing on multiple iPhone 5 devices.
One thing to note is that there is no issue when switching the camera during recording. The problem only appears when you have paused the recorder and then do the switch.
Steps to reproduce on iPhone 4S running 7.1.1:
Observed:
No video is recorded by the camera that was switched to during the pause
Hello, @rFlex
Thanks for the great source, it works like a charm.
However, I have a little problem: how can I use the library to record audio alone? Could you give an example?
Jack
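Not an authoritative answer, but based on the audioEnabled toggle used in micClick: elsewhere on this page, an audio-only setup would presumably disable the video side. videoEnabled is an assumption here; check your SCRecorder version's header before relying on it:

```objc
SCRecorder *recorder = [SCRecorder recorder];
recorder.audioEnabled = YES;   // property used in micClick: above
recorder.videoEnabled = NO;    // assumed counterpart; verify it exists
```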
I'm having trouble configuring the recorder to capture and export video cropped to a square aspect ratio. Without expertise in AVFoundation I am having a hard time understanding whether this is currently possible to do. Is this currently a supported feature for iOS?
I can see that Issue #1 relates to this question but it was closed 7 months ago and the codebase has changed since that resolution. Specifically, it seems that the functionality associated with useInputFormatTypeAsOutputType has been rewritten.
I have tried to configure the recorder using the following methods without success:
Any information on this matter would be greatly appreciated.
NSString *const AVCaptureSessionPresetPhoto;
NSString *const AVCaptureSessionPresetHigh;
NSString *const AVCaptureSessionPresetMedium;
NSString *const AVCaptureSessionPresetLow;
NSString *const AVCaptureSessionPreset352x288;
NSString *const AVCaptureSessionPreset640x480;
NSString *const AVCaptureSessionPreset1280x720;
NSString *const AVCaptureSessionPreset1920x1080;
NSString *const AVCaptureSessionPresetiFrame960x540;
NSString *const AVCaptureSessionPresetiFrame1280x720;
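For context, the preset is passed at initialization time, as in the viewDidLoad example elsewhere on this page. None of the built-in presets produce square output directly, so a 1:1 aspect ratio still requires cropping at export time. A hedged sketch, assuming the SCCamera initializer shown above:

```objc
// 640x480 is the closest built-in preset to a 640x640 target;
// the square crop must be applied when exporting.
SCCamera *camera = [[SCCamera alloc]
    initWithSessionPreset:AVCaptureSessionPreset640x480];
```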
I used CocoaPods to add the library: pod 'SCRecorder', '~> 2.0.14'
I use the method:
removeSegmentAtIndex: deleteFile:
to remove a segment.
After removing a segment, I update the recorded-time label using the code below:
CMTime currentTime = kCMTimeZero; // declare and initialize before use
if (_recorder.recordSession != nil) {
currentTime = _recorder.recordSession.currentRecordDuration;
}
self.timeRecordedLabel.text = [NSString stringWithFormat:@"%.2f S", CMTimeGetSeconds(currentTime)];
But the reported time is wrong: it does not change after one removal. Only when I call removeSegmentAtIndex: twice does the time change!