
sdavassetexportsession's Introduction

SDAVAssetExportSession

AVAssetExportSession drop-in replacement with customizable audio&video settings.

You want the ease of use of AVAssetExportSession, but the default presets don't fit your needs? So you started reading the documentation for AVAssetWriter, AVAssetWriterInput, AVAssetReader, AVAssetReaderVideoCompositionOutput, AVAssetReaderAudioMixOutput… and ran out of aspirin? SDAVAssetExportSession is a rewrite of AVAssetExportSession on top of the AVAssetReader* and AVAssetWriter* APIs. Unlike AVAssetExportSession, you are not limited to a set of presets – you have full control over the audio and video settings.

Usage Example

SDAVAssetExportSession *encoder = [SDAVAssetExportSession.alloc initWithAsset:anAsset];
encoder.outputFileType = AVFileTypeMPEG4;
encoder.outputURL = outputFileURL;
encoder.videoSettings = @
{
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoWidthKey: @1920,
    AVVideoHeightKey: @1080,
    AVVideoCompressionPropertiesKey: @
    {
        AVVideoAverageBitRateKey: @6000000,
        AVVideoProfileLevelKey: AVVideoProfileLevelH264High40,
    },
};
encoder.audioSettings = @
{
    AVFormatIDKey: @(kAudioFormatMPEG4AAC),
    AVNumberOfChannelsKey: @2,
    AVSampleRateKey: @44100,
    AVEncoderBitRateKey: @128000,
};

[encoder exportAsynchronouslyWithCompletionHandler:^
{
    if (encoder.status == AVAssetExportSessionStatusCompleted)
    {
        NSLog(@"Video export succeeded");
    }
    else if (encoder.status == AVAssetExportSessionStatusCancelled)
    {
        NSLog(@"Video export cancelled");
    }
    else
    {
        NSLog(@"Video export failed with error: %@ (%ld)", encoder.error.localizedDescription, (long)encoder.error.code);
    }
}];
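Since the class mirrors the AVAssetExportSession API, progress observation and cancellation work the same way. A short sketch, assuming the `progress` property (which is KVO-observable, as one of the issues below exercises) and the `cancelExport` method:

```objectivec
// Observe export progress via KVO, e.g. to drive a progress bar.
[encoder addObserver:self
          forKeyPath:@"progress"
             options:NSKeyValueObservingOptionNew
             context:NULL];

// Later, e.g. when the user taps a cancel button:
[encoder cancelExport];
```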

Licenses

All source code is licensed under the MIT-License.

sdavassetexportsession's People

Contributors

bryanhatwood, gonghao, gsabran, gshahbazian, hakanw, jallens, jamescmartinez, jeffrey903, koush, macdoum1, marcvanolmen, pnicholls, rs, samiandoni, solomon23, streeter, tifroz, zakinaeem


sdavassetexportsession's Issues

iOS 9.3.1 iPhone 5c - Uncaught exception: NSInvalidArgumentException: -[CADisplayLink integerValue]: unrecognized selector sent to instance 0x1a13ff10

Hi,
This is really stumping me, so any help would be appreciated!

The code was working well on the iPhone 5c before the update to iOS 9.3.1, and it currently works with other iOS devices (an iPhone 6 running iOS 9.3.1 and an iPad Air 2 running iOS 9.2.1).

The video is a recording made earlier on the same device, then exported from .mov to .mp4 using the following code:

SDAVAssetExportSession *encoder = [SDAVAssetExportSession.alloc initWithAsset:avAsset];

encoder.outputFileType = AVFileTypeMPEG4;
encoder.outputURL = [NSURL fileURLWithPath:destPath];
encoder.shouldOptimizeForNetworkUse = YES;
encoder.videoSettings = @{
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoWidthKey: videoWidthHeightKey,
    AVVideoHeightKey: videoWidthHeightKey,
    AVVideoCompressionPropertiesKey: @
    {
        AVVideoAverageBitRateKey: @3000000,
        AVVideoProfileLevelKey: AVVideoProfileLevelH264BaselineAutoLevel,
    }
};

encoder.audioSettings = @{
    AVFormatIDKey: @(kAudioFormatMPEG4AAC),
    AVNumberOfChannelsKey: @2,
    AVSampleRateKey: @44100,
    AVEncoderBitRateKey: @128000,
};

encoder.videoComposition = videoComposition;

[encoder exportAsynchronouslyWithCompletionHandler:^
{
    switch ([encoder status]) {

        case AVAssetExportSessionStatusFailed:
            [[NSOperationQueue mainQueue] addOperationWithBlock:^ {
                [self ReportError: formatConversionFailed];
            }];
            break;

        case AVAssetExportSessionStatusCancelled:
            break;

        default:

    ...

The exception appears to occur inside exportAsynchronouslyWithCompletionHandler:(void (^)())handler,
on the following line (I can't see what's wrong with it, and it works fine on other devices):

self.videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:self.videoSettings];

The Xcode output log is:

-[CADisplayLink integerValue]: unrecognized selector sent to instance 0x1a13ff10
2016-04-23 00:43:10.273 appname[8240:2536741] Uncaught exception: NSInvalidArgumentException: -[CADisplayLink integerValue]: unrecognized selector sent to instance 0x1a13ff10
(
0  CoreFoundation        0x24289ba3 <redacted> + 150
1  libobjc.A.dylib       0x23a46dff objc_exception_throw + 38
2  CoreFoundation        0x2428f4d5 <redacted> + 0
3  CoreFoundation        0x2428d12b <redacted> + 702
4  CoreFoundation        0x241b7358 _CF_forwarding_prep_0 + 24
5  AVFoundation          0x29cdfae9 <redacted> + 1620
6  AVFoundation          0x29cdf075 <redacted> + 60
7  AVFoundation          0x29cd9663 <redacted> + 334
8  AVFoundation          0x29cdde77 <redacted> + 42
9  AVFoundation          0x29cd9663 <redacted> + 334
10 AVFoundation          0x29c4f3bd <redacted> + 340
11 AVFoundation          0x29c4f263 <redacted> + 30
12 AVFoundation          0x29c4f1d5 <redacted> + 44
13 testappplayscotland01 0x010992dd -[SDAVAssetExportSession exportAsynchronouslyWithCompletionHandler:] + 3590

Bad video cropping

Hi,

When trying to re-encode a MOV to MP4 with the example from the readme, I get borders on the left and right with the rear camera, and a square crop with the front camera.

Without any re-encoding, my player (PBJVideoPlayerController, using AVLayerVideoGravityResizeAspectFill) correctly displays the fullscreen video.

I guess something is wrong with the renderSize of SDAVAssetExportSession, but I don't really understand why nobody else seems to have this problem.

Here are screenshots with the front and rear cameras after re-encoding:

Screenshot of video without re-encoding, rear camera: http://imgur.com/SyoQcDF (perfectly fullscreen)
Screenshot of video with re-encoding, rear camera: http://imgur.com/DLFbAeW
Screenshot of video with re-encoding, front camera: http://imgur.com/8WelhWF

As you can see, the player shouldn't be the problem, since everything is fine without re-encoding. But after exporting, borders appear with the rear camera, and I don't really know what happens with the front camera...

Any help?

Thanks a lot !

PS : Like my coffee bowl ?

Edit: Cross-posted to Stack Overflow (http://stackoverflow.com/questions/33957150/strange-cropping-behaviour); I will update both pages according to the answers.

Issue with portrait video

Hi, first of all thank you for your work, it is a great drop-in replacement!

Compressing landscape video works great, fast, and painless...
However, portrait video does not seem to be as painless ;-)

I'm having an issue with the resulting video being stretched, but better than a thousand words, here are the original and result screens...

Original
720x1280

Result
result

As you can see, it is stretched down on the width and stretched up on the height... I cannot seem to find out why...

My code using your class is :

    AVAsset *asset = [AVAsset assetWithURL:[info objectForKey:@"UIImagePickerControllerMediaURL"]];
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    CGSize renderSize = [PMVideoTools getVideoSize:asset]; // Aware of orientation

    NSLog(@"Transfer RenderSize %f - %f", renderSize.width, renderSize.height);

    SDAVAssetExportSession *encoder = [SDAVAssetExportSession.alloc initWithAsset:asset];
    encoder.outputFileType = AVFileTypeMPEG4;
    encoder.outputURL = [PMVideoTools createURLWithFileName:@"output.mp4"];

    NSLog(@"Exporting to : %@", encoder.outputURL.path);
    encoder.videoSettings = @
    {
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoWidthKey: [NSNumber numberWithFloat:renderSize.width],
        AVVideoHeightKey: [NSNumber numberWithFloat:renderSize.height],
        AVVideoCompressionPropertiesKey: @
        {
            AVVideoAverageBitRateKey: [NSNumber numberWithInt:videoTrack.estimatedDataRate],
            AVVideoProfileLevelKey: AVVideoProfileLevelH264BaselineAutoLevel,
        },
    };

    encoder.audioSettings = @
    {
        AVFormatIDKey: @(kAudioFormatMPEG4AAC),
        AVNumberOfChannelsKey: @2,
        AVSampleRateKey: @44100,
        AVEncoderBitRateKey: @128000,
    };

Is there something wrong with my code?

Crash on [self.writer endSessionAtSourceTime:lastSamplePresentationTime];

About 1 in 100 of our users' video exports crashes in

(void) finish;

at:

[self.writer endSessionAtSourceTime:lastSamplePresentationTime];

with the exception:

NSInvalidArgumentException: *** -[AVAssetWriter endSessionAtSourceTime:] invalid parameter not satisfying: ((Boolean)(((endTime).flags & (kCMTimeFlags_Valid | kCMTimeFlags_ImpliedValueFlagsMask)) == kCMTimeFlags_Valid))

If my logic is correct, it appears that to cause this crash, lastSamplePresentationTime.flags would have to contain one of the following:

kCMTimeFlags_PositiveInfinity
kCMTimeFlags_NegativeInfinity
kCMTimeFlags_Indefinite

(Flag reference from: CMTime Reference)

Any idea if this appears to be a bug in implementation of the library or some sort of timing discrepancy in the composition?
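The precondition in that exception is exactly what the CMTIME_IS_NUMERIC macro from <CoreMedia/CMTime.h> tests (the time is valid and carries none of the infinity/indefinite flags), so a defensive guard in -finish might look like this sketch:

```objectivec
// Sketch of a guard before ending the writer session. CMTIME_IS_NUMERIC is
// true only when kCMTimeFlags_Valid is set and no implied-value flags are.
if (CMTIME_IS_NUMERIC(lastSamplePresentationTime)) {
    [self.writer endSessionAtSourceTime:lastSamplePresentationTime];
}
// Otherwise fall through and let the writer finish without an explicit end time.
```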

BTW -- Awesome library and a real time saver for us!

Thanks!
Clay

Strange cropping after setVideoMirrored

I use AVCaptureMovieFileOutput to record videos.
SDAVAssetExportSession works fine; I use it to compress my recorded videos.
An issue appears when I set setVideoMirrored to true: SDAVAssetExportSession crops the video.

Code for setVideoMirrored:

        AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
        AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
        [connection setVideoMirrored:true];

Code for SDAVAssetExportSession:

 AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:myVideoURL options:nil];

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString* videoPath = [NSString stringWithFormat:@"%@/video.mov", [paths objectAtIndex:0]];
    NSURL * myNewVideoUrl = [NSURL fileURLWithPath:videoPath];

    [[NSFileManager defaultManager] removeItemAtURL: myNewVideoUrl error:nil];

    exportSession_convertVideo2 = [SDAVAssetExportSession.alloc initWithAsset:avAsset];
    exportSession_convertVideo2.outputFileType = AVFileTypeQuickTimeMovie;
    exportSession_convertVideo2.shouldOptimizeForNetworkUse = YES;
    exportSession_convertVideo2.outputURL = myNewVideoUrl; 
    exportSession_convertVideo2.videoSettings = @
    {
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoWidthKey: [NSNumber numberWithFloat:540.0],
        AVVideoHeightKey: [NSNumber numberWithFloat:960.0],
        AVVideoCompressionPropertiesKey: @
        {
            AVVideoAverageBitRateKey: @1000000,
            AVVideoProfileLevelKey: AVVideoProfileLevelH264High40,
        },
    };
    exportSession_convertVideo2.audioSettings = @
    {
        AVFormatIDKey: @(kAudioFormatMPEG4AAC),
        AVNumberOfChannelsKey: @2,
        AVSampleRateKey: @44100,
        AVEncoderBitRateKey: @128000,
    };


    [exportSession_convertVideo2 exportAsynchronouslyWithCompletionHandler:^
    {
        if (exportSession_convertVideo2.status == AVAssetExportSessionStatusCompleted)
        {
            NSLog(@"Video export succeeded"); 
        }
        else if (exportSession_convertVideo2.status == AVAssetExportSessionStatusCancelled)
        {
            NSLog(@"Video export cancelled"); 
        }
        else
        {
            NSLog(@"Video export failed with error: %@ (%ld)", exportSession_convertVideo2.error.localizedDescription, (long)exportSession_convertVideo2.error.code);
        }
    }];

Original:
img_0494

After using SDAVAssetExportSession:
capture d ecran 2016-02-16 a 15 40 04

Processing an .mp4 file generates a black-screen video with audio

I am processing this video:

screen shot 2017-11-22 at 12 07 51

  NSURL *inputUrl = [NSURL fileURLWithPath:inputPath];
  AVURLAsset *inputUrlAsset = [AVURLAsset assetWithURL:inputUrl];

  NSURL *outputUrl = [NSURL fileURLWithPath:outputPath];
  
  SDAVAssetExportSession *encoder = [SDAVAssetExportSession.alloc initWithAsset:inputUrlAsset];
  encoder.outputFileType = AVFileTypeMPEG4;
  encoder.outputURL = outputUrl;
  //  1280x720
  encoder.videoSettings = @
  {
  AVVideoCodecKey: AVVideoCodecH264,
  AVVideoWidthKey: @720,
  AVVideoHeightKey: @1280,
  AVVideoCompressionPropertiesKey: @
    {
    AVVideoAverageBitRateKey: @512000,
    AVVideoProfileLevelKey: AVVideoProfileLevelH264High40,
    },
  };
  encoder.audioSettings = @
  {
  AVFormatIDKey: @(kAudioFormatMPEG4AAC),
  AVNumberOfChannelsKey: @2,
  AVSampleRateKey: @44100,
  AVEncoderBitRateKey: @96000,
  };
  
  [encoder exportAsynchronouslyWithCompletionHandler:^
  {
    if (encoder.status == AVAssetExportSessionStatusCompleted)
    {
      NSLog(@"Video export succeeded");
    }
    else if (encoder.status == AVAssetExportSessionStatusCancelled)
    {
      NSLog(@"Video export cancelled");
    }
    else
    {
      NSLog(@"Video export failed with error: %@ (%ld)", encoder.error.localizedDescription, (long)encoder.error.code);
    }
  }];

The output video is a black screen with the audio.
I am able to process .mov files without problems.
Does anyone know where the problem lies?

Quality-based compression?

I am compressing a video for upload using SDAVAssetExportSession:

SDAVAssetExportSession *exportSession = [[SDAVAssetExportSession alloc] initWithAsset:composition];

exportSession.videoSettings =
@{
  AVVideoCodecKey: AVVideoCodecH264,
  AVVideoWidthKey: @640,
  AVVideoHeightKey: @640,
  AVVideoCompressionPropertiesKey: @
      {
      AVVideoAverageBitRateKey: @1100000,
      AVVideoProfileLevelKey: AVVideoProfileLevelH264High40,
      },
  };

When encoding video using Handbrake on OSX I can choose to use a "Constant Quality" setting instead of average bitrate:

enter image description here

This seems to give better results on a variety of video sources.

Can I do the same with SDAVAssetExportSession?

If not, is there another way I could compress the video using "Constant Quality" compression?

(post referred from http://stackoverflow.com/questions/36864682/sdavexportsession-quality-based-compression)

-[AVAssetWriterInput requestMediaDataWhenReadyOnQueue:usingBlock:] Cannot call method when status is 0

Hi, I'm using the library with the following configuration for compressing video files:

    SDAVAssetExportSession *encoder = [SDAVAssetExportSession.alloc initWithAsset:urlAsset];
    encoder.outputFileType = AVFileTypeMPEG4;
    encoder.outputURL = TempLowerQualityFileURL;
    encoder.shouldOptimizeForNetworkUse = YES;

    CGSize size = [[[urlAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize];
    NSNumber *videoWidth = @(size.width);
    NSNumber *videoHeight = @(size.height);

    NSDictionary *videoCleanApertureSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                                videoWidth, AVVideoCleanApertureWidthKey,
                                                videoHeight, AVVideoCleanApertureHeightKey,
                                                [NSNumber numberWithInt:10], AVVideoCleanApertureHorizontalOffsetKey,
                                                [NSNumber numberWithInt:10], AVVideoCleanApertureVerticalOffsetKey,
                                                nil];


    NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [NSNumber numberWithInt:1960000], AVVideoAverageBitRateKey,
                                   [NSNumber numberWithInt:24],AVVideoMaxKeyFrameIntervalKey,
                                   videoCleanApertureSettings, AVVideoCleanApertureKey,
                                   nil];



    NSDictionary *videoCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              AVVideoCodecH264, AVVideoCodecKey,
                                              codecSettings,AVVideoCompressionPropertiesKey,
                                              videoWidth, AVVideoWidthKey,
                                              videoHeight, AVVideoHeightKey,
                                              nil];

    encoder.videoSettings = videoCompressionSettings;

It crashes in -exportAsynchronouslyWithCompletionHandler: at line 214, which is:
[self.audioInput requestMediaDataWhenReadyOnQueue:self.inputQueue usingBlock:

But when I replace the line:
encoder.outputFileType = AVFileTypeMPEG4;
with
encoder.outputFileType = AVFileTypeQuickTimeMovie;

it works, but the size of the compressed video increases by 200% or more.

Video export failed with error: Cannot Open (-11829)

First, I use AVCaptureSession with a movieFileOutput to record a movie, then use your code to convert it. The errors below usually show up. I hope to get your help, thanks.

Video export failed with error: Cannot Save (-11823)
Video export failed with error: The operation could not be completed (-11800)
Video export failed with error: Cannot Open (-11829)

*** -[AVAssetWriter endSessionAtSourceTime:] Cannot call method when status is 1

Exception: *** -[AVAssetWriter endSessionAtSourceTime:] Cannot call method when status is 1
0 CoreFoundation (null)
1 libobjc.A.dylib (null)
2 AVFoundation (null)
3 AVFoundation (null)
4 Veme (null)
5 Veme (null)
6 AVFoundation (null)
7 libdispatch.dylib (null)
8 libdispatch.dylib (null)
9 libdispatch.dylib (null)
10 libdispatch.dylib (null)
11 libdispatch.dylib (null)
12 libsystem_pthread.dylib (null)
13 libsystem_pthread.dylib (null)
}

AVAssetReader status Failed after [AVAssetReader startReading]

This error occurs only on iOS 10 (also tested on iOS 9 and 11, where it works fine) for a 3GP video shared from the iPhone gallery.

Here are the video and audio settings dictionaries I am using:
Video settings:
@{AVVideoCodecKey: AVVideoCodecH264,
AVVideoWidthKey: @(fabs(dimensions.width)),
AVVideoHeightKey: @(fabs(dimensions.height)),
AVVideoCompressionPropertiesKey: @{AVVideoAverageBitRateKey: @(3000000),
AVVideoProfileLevelKey: AVVideoProfileLevelH264High40}}

Audio settings:
@{AVFormatIDKey: @(kAudioFormatMPEG4AAC),
AVNumberOfChannelsKey: @2,
AVSampleRateKey: @44100,
AVEncoderBitRateKey: @128000}

I have also tried passing nil for both the audio and video settings, but it still doesn't work. I tried nil because -[AVAssetReader startReading] validates the audio and video settings, and I wanted to rule out the possibility of wrong settings.

Progress skips around

I'm using KVO to track progress and drive a progress bar. I'm noticing the progress bar skipping around, and looking at the values I see why:

0.972635
0.974072
0.975509
0.99885
0.976945
0.978382

My video is a mixed composition with multiple audio tracks and is cropped from the total length of the video. Could that affect this?

It crashes when I export a short audio clip (1-2 seconds)

It seems that the member variable lastSamplePresentationTime causes the problem.

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVAssetWriter endSessionAtSourceTime:] invalid parameter not satisfying: ((Boolean)(((endTime).flags & (kCMTimeFlags_Valid | kCMTimeFlags_ImpliedValueFlagsMask)) == kCMTimeFlags_Valid))'
*** First throw call stack:
(
0 CoreFoundation 0x0000000107bc0c65 __exceptionPreprocess + 165
1 libobjc.A.dylib 0x000000010782abb7 objc_exception_throw + 45
2 AVFoundation 0x0000000103f4baed -[AVAssetWriter cancelWriting] + 0
3 LordofRap 0x0000000102f311a2 -[SDAVAssetExportSession finish] + 466
4 LordofRap 0x0000000102f2fbca __68-[SDAVAssetExportSession exportAsynchronouslyWithCompletionHandler:]_block_invoke160 + 362
5 AVFoundation 0x0000000103f5ed2e -[AVAssetWriterInputMediaDataRequester requestMediaDataIfNecessary] + 88
6 libdispatch.dylib 0x000000010849a186 _dispatch_call_block_and_release + 12
7 libdispatch.dylib 0x00000001084b9614 _dispatch_client_callout + 8
8 libdispatch.dylib 0x00000001084a06a7 _dispatch_queue_drain + 2176
9 libdispatch.dylib 0x000000010849fcc0 _dispatch_queue_invoke + 235
10 libdispatch.dylib 0x00000001084a33b9 _dispatch_root_queue_drain + 1359
11 libdispatch.dylib 0x00000001084a4b17 _dispatch_worker_thread3 + 111
12 libsystem_pthread.dylib 0x0000000108826637 _pthread_wqthread + 729
13 libsystem_pthread.dylib 0x000000010882440d start_wqthread + 13
)

ERROR *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVAssetReaderVideoCompositionOutput initWithVideoTracks:videoSettings:] invalid parameter not satisfying: [videoTracks count] >= 1'

Hi There,

I encountered an error; I hope you will be able to tell me what's going wrong.

thanks

Des

- (void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection
{
    [self dismissViewControllerAnimated:YES completion:nil];

    for (MPMediaItem *item in mediaItemCollection.items)
    {
        NSString *title = [item valueForProperty:MPMediaItemPropertyTitle];
        self.assetURL = [item valueForProperty:MPMediaItemPropertyAssetURL];
        NSString *albumTitle = [item valueForProperty:MPMediaItemPropertyAlbumTitle];
        NSString *artist = [item valueForProperty:MPMediaItemPropertyArtist];
        NSString *genre = [item valueForProperty:MPMediaItemPropertyGenre];
        NSLog(@"title: %@, url: %@ %@ %@ %@", title, self.assetURL, albumTitle, artist, genre);
    }

    self.asset = [AVAsset assetWithURL:self.assetURL];

    // Path of your destination save audio file
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDir = [paths objectAtIndex:0];
    NSString *savedPath = [documentsDir stringByAppendingPathComponent:@"audio.m4a"];

    NSLog(@"cacheDir %@", savedPath);
    [self exportTrimAudio11:self.asset toFilePath:savedPath];
}

- (BOOL)exportTrimAudio11:(AVAsset *)avAsset toFilePath:(NSString *)filePath
{
    // we need the audio asset to be at least 50 seconds long for this snippet
    CMTime assetTime = [avAsset duration];
    Float64 duration = CMTimeGetSeconds(assetTime);
    if (duration < 50.0) return NO;

    // create trim time range - 30 seconds starting from 10 seconds into the asset
    CMTime startTime = CMTimeMake(10, 1);
    CMTime stopTime = CMTimeMake(40, 1);
    CMTimeRange exportTimeRange = CMTimeRangeFromTimeToTime(startTime, stopTime);

    SDAVAssetExportSession *encoder = [[SDAVAssetExportSession alloc] initWithAsset:avAsset];
    encoder.outputFileType = AVFileTypeAppleM4A;
    encoder.outputURL = [NSURL fileURLWithPath:filePath];
    encoder.timeRange = exportTimeRange;
    //encoder.shouldOptimizeForNetworkUse = YES;
    encoder.audioSettings = @
    {
        AVFormatIDKey: @(kAudioFormatMPEG4AAC),
        AVNumberOfChannelsKey: @2,
        AVSampleRateKey: @44100,
        AVEncoderBitRateKey: @128000,
    };

    // perform the export
    [encoder exportAsynchronouslyWithCompletionHandler:^{
        if (AVAssetExportSessionStatusCompleted == encoder.status) {
            NSLog(@"AVAssetExportSessionStatusCompleted");
        } else if (AVAssetExportSessionStatusFailed == encoder.status) {
            // a failure may happen because of an event out of your control
            // for example, an interruption like a phone call coming in
            // make sure and handle this case appropriately
            NSLog(@"AVAssetExportSessionStatusFailed");
        } else {
            NSLog(@"Export Session Status: %ld", (long)encoder.status);
        }
    }];

    return YES;
}

Export video in background

Hi, is there a way to make this work in the background (at least within the allowed 3 minutes) if I hit the home button while the export is in progress?
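The library doesn't do anything special for backgrounding, but the standard UIKit background-task pattern should buy the export the system-allotted time. A sketch, assuming `encoder` is a configured SDAVAssetExportSession:

```objectivec
// Wrap the export in a background task so it may continue after the home button.
__block UIBackgroundTaskIdentifier taskID =
    [UIApplication.sharedApplication beginBackgroundTaskWithExpirationHandler:^{
        [encoder cancelExport];  // time is up; stop cleanly
        [UIApplication.sharedApplication endBackgroundTask:taskID];
    }];

[encoder exportAsynchronouslyWithCompletionHandler:^{
    [UIApplication.sharedApplication endBackgroundTask:taskID];
}];
```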

Crash when set audioSettings to nil

From SDAVAssetExportSession.h,

/**
 * The settings used for encoding the audio track.
 *
 * A value of nil specifies that appended output should not be re-encoded.
 * The dictionary’s keys are from <CoreVideo/CVPixelBuffer.h>.
 */
@property (nonatomic, copy) NSDictionary *audioSettings;

So I guess this means that if audioSettings is set to nil, the asset's original audio will be passed through without re-encoding, right?
But in fact, if audioSettings is set to nil, it crashes. Did I misunderstand the behavior?

Set framerate of exported video

I have a 240fps 720p video recorded on the iPhone's internal camera which I'm trying to compress while maintaining the frame rate and resolution:

SDAVAssetExportSession *exportSession = [[SDAVAssetExportSession alloc] initWithAsset:asset];

exportSession.videoSettings =
@{
  AVVideoCodecKey: AVVideoCodecH264,
  AVVideoWidthKey: @1280,
  AVVideoHeightKey: @720,
  AVVideoCompressionPropertiesKey: @
      {
      AVVideoAverageBitRateKey: @1100000,
      AVVideoProfileLevelKey: AVVideoProfileLevelH264High40,
      },
  };

exportSession.audioSettings = @
{
AVFormatIDKey: @(kAudioFormatMPEG4AAC),
AVNumberOfChannelsKey: @2,
AVSampleRateKey: @44100,
AVEncoderBitRateKey: @128000,
};

This works, but the output is a 30fps file. Can I get SDAVAssetExportSession to output at 240fps (or just keep the same frame rate as the input)?
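The output frame rate is driven by the video composition's frameDuration, so one thing to try (a sketch, not a confirmed fix) is supplying an explicit composition built from the asset:

```objectivec
AVAssetTrack *videoTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;

// Build a composition from the asset and pin its frameDuration to the
// source track's nominal frame rate (240 fps for this recording).
AVMutableVideoComposition *composition =
    [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:asset];
composition.frameDuration = CMTimeMake(1, (int32_t)videoTrack.nominalFrameRate);
exportSession.videoComposition = composition;
```

AVVideoExpectedSourceFrameRateKey can also be added under AVVideoCompressionPropertiesKey as a hint to the encoder, though it does not by itself change the output rate.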

Hyperlapse videos

I am using the latest version (0.0.2). I am trying to transcode videos to a smaller size to upload to a server. It works great with videos taken from the built-in camera, regardless of orientation; I swap my width and height based on a check I do to see whether the video was taken in portrait or landscape.

The only use case this doesn't work with is source videos filmed with Instagram's Hyperlapse app. For some reason I can't figure out, the transforms for those videos are slightly different from the ones from the built-in camera. The a, b, c, d values match up when filmed in the different orientations, but the tx and ty translation values are not the same as when filmed using the built-in camera app. For example, the built-in camera app, when filming in landscape with the home button on the left, produces a transform that looks like this:
transform a = -1.0, b = 0.0, c = 0.0, d = -1.0, tx = 640.0, ty = 360.0
but, when filmed using Hyperlapse in the same orientation, the transform looks like this:
transform a = -1.0, b = 0.0, c = 0.0, d = -1.0, tx = 0.0, ty = 0.0
The translations are always set to 0, and this produces a black video when I transcode.

If I manually set the tx and ty values in buildDefaultVideoComposition when assigning the transform to the passThroughLayer, it works fine.

Any ideas? Could this be a bug in the transform of the Hyperlapse videos, or am I missing something?
Thanks.

Export Session Without videoComposition Crashes

Since the videoOutput is an instance of AVAssetReaderVideoCompositionOutput, which requires its videoComposition property to be set, the app will crash at runtime if the client doesn't set this property.
So when the client doesn't set the videoComposition property, either use AVAssetReaderTrackOutput as the videoOutput, or set the videoComposition property to a default composition ([AVMutableVideoComposition videoComposition]) populated with the required values from the asset's video track. I prefer the latter.
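The latter option could be sketched inside the session's reader setup roughly like this (a hypothetical patch, not the library's actual code; `videoOutput` stands in for the session's AVAssetReaderVideoCompositionOutput):

```objectivec
// Fall back to a composition derived from the asset itself when the client
// didn't provide one, so AVAssetReaderVideoCompositionOutput always has
// a valid composition to work with.
if (self.videoComposition == nil) {
    self.videoComposition =
        [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:self.asset];
}
videoOutput.videoComposition = self.videoComposition;
```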

What's the expected default crop/resize behavior?

This is a question about the scope and design of this library, not an issue or question about the code itself.

Just curious, what is the expected crop and resize behavior (in the default videoComposition) when you pass in a AVVideoWidthKey and AVVideoHeightKey to videoSettings?

I know a couple cropping and resize issues have cropped up recently (pun intended) and I've run into some myself. It would be helpful if the intended default behavior was explicitly documented so that, for example, when black borders appear on the edges of a video I know whether that's expected behavior or a bug.

Also, I'm currently in the process of editing the default video composition code to suit my own needs. I'd be happy to PR when it's done, but I worry that I might "break" or change the current default behavior because I'm not really sure what it's supposed to be.

Relatedly, would supporting multiple crop/resize behaviors (selectable via the currently ignored AVVideoScalingModeKey in videoSettings, perhaps) be within the scope of this project?

Thanks!

Xcode 9: This block declaration is not a prototype

Xcode 9 with the recommended settings issues a warning for 3 lines. Suggested fixes:

In the .h file

-- (void)exportAsynchronouslyWithCompletionHandler:(void (^)())handler;
+- (void)exportAsynchronouslyWithCompletionHandler:(void (^)(void))handler;

In the .m file

-@property (nonatomic, strong) void (^completionHandler)();
+@property (nonatomic, strong) void (^completionHandler)(void);

and

-- (void)exportAsynchronouslyWithCompletionHandler:(void (^)())handler
+- (void)exportAsynchronouslyWithCompletionHandler:(void (^)(void))handler

@rs Would there be any problem to change these lines with the recommended fixes?

CVPixelBufferLockBaseAddress and CVPixelBufferUnlockBaseAddress should not be called.

hi,

I'm not sure why CVPixelBufferLockBaseAddress is called here:

CVPixelBufferLockBaseAddress(renderBuffer, 0);
[self.delegate exportSession:self renderFrame:pixelBuffer withPresentationTime:lastSamplePresentationTime toBuffer:renderBuffer];
CVPixelBufferUnlockBaseAddress(renderBuffer, 0);

In our case we use the buffer only on the GPU, and in that scenario there is no need to call it; the Apple docs even have an important warning against it.
I feel it should not be done by default by the library.

Here is the information from the official Apple docs:

https://developer.apple.com/library/prerelease/ios/documentation/QuartzCore/Reference/CVPixelBufferRef/index.html#//apple_ref/c/func/CVPixelBufferLockBaseAddress

You must call the CVPixelBufferLockBaseAddress function before accessing pixel data with the CPU, and call the CVPixelBufferUnlockBaseAddress function afterward. If you include the kCVPixelBufferLock_ReadOnly value in the lockFlags parameter when locking the buffer, you must also include it when unlocking the buffer.

IMPORTANT

When accessing pixel data with the GPU, locking is not necessary and can impair performance.
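One way to reconcile both cases is to make the lock opt-in. A sketch using a hypothetical `delegateNeedsCPUAccess` flag (not part of the library's API):

```objectivec
// Lock only when the delegate will read or write pixel data on the CPU;
// GPU-only consumers skip the (potentially costly) lock entirely.
if (self.delegateNeedsCPUAccess) {
    CVPixelBufferLockBaseAddress(renderBuffer, 0);
}
[self.delegate exportSession:self
                 renderFrame:pixelBuffer
        withPresentationTime:lastSamplePresentationTime
                    toBuffer:renderBuffer];
if (self.delegateNeedsCPUAccess) {
    CVPixelBufferUnlockBaseAddress(renderBuffer, 0);
}
```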

Transform for front facing camera results in half of the video missing

I have video that is coming in from the front facing camera with the following properties:

Property Value
natural size width 1280.000000
natural size height 720.000000
target size width 720.000000
target size height 1280.000000
preferredTransform a: 0.000000 b: 1.000000 c: 1.000000 d: 0.000000 tx: 0.000000 ty: 0.000000

Which results in a video missing half of its content:

simulator screen shot 5 apr 2017 6 58 24 am

I would love to fix this myself but I have yet to get a good grasp on how the transforms work. Any tips?

Cheers!

SDAVAssetExportSession takes a long time

Compressing and exporting a video from the photo library (about 300 MB) with SDAVAssetExportSession takes about 40 s, but the system AVAssetExportSession takes only 20 s. Can anyone help me figure out why?
My settings:

        QUINT64 bitrate = (naturalSize.width * naturalSize.height * 3); 
        NSDictionary *videoSettings = @{
                                        AVVideoCodecKey: AVVideoCodecH264,
                                        AVVideoWidthKey: CZ_NSNumber(naturalSize.width),
                                        AVVideoHeightKey: CZ_NSNumber(naturalSize.height),
                                        AVVideoCompressionPropertiesKey: @
                                            {
                                              AVVideoAverageBitRateKey: @(bitrate)
                                            }
                                        };
        exportSession.videoSettings = videoSettings;
        
        exportSession.audioSettings = @
        {
            AVFormatIDKey: CZ_NSNumber(kAudioFormatMPEG4AAC),
            AVNumberOfChannelsKey: CZ_NSNumber(1),
            AVSampleRateKey: @(44100),
            AVEncoderBitRateKey:  @(64000),
        };

Wrong dimensions for AVAssetWriterInputPixelBufferAdaptor while using portrait videos

Scenario:
Encoding a portrait video without custom video composition and manipulating frames using SDAVAssetExportSessionDelegate

Issue:
The manipulated frames end up written to file horizontally squeezed.

Reason: the AVAssetWriterInputPixelBufferAdaptor is initialised with the video track's natural size, ignoring the preferred transform.

Issue in landscape exported video.

Hi, I'm having an issue when trying to export a video in landscape mode.
This is the original video (see first attached screenshot), and the exported video is cut in half (see second attached screenshot).

My configuration for the encoder is this:

let encoder = SDAVAssetExportSession(asset: sourceAsset)
encoder.outputFileType = AVFileTypeMPEG4
encoder.outputURL = savePathUrl

encoder.videoSettings = [
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoWidthKey: 900,
    AVVideoHeightKey: 506,
    AVVideoCompressionPropertiesKey: [
        AVVideoAverageBitRateKey: 725000,
        AVVideoProfileLevelKey: AVVideoProfileLevelH264High40,
    ]
]

encoder.audioSettings = [
    AVFormatIDKey: NSNumber(int: Int32(kAudioFormatMPEG4AAC)),
    AVNumberOfChannelsKey: 2,
    AVSampleRateKey: 44100,
    AVEncoderBitRateKey: 64000
]
encoder.shouldOptimizeForNetworkUse = true
The original video is 1280x720.

Any suggestions? Thanks.

I get the error "-[AVAssetWriter startSessionAtSourceTime:] invalid parameter not satisfying: CMTIME_IS_NUMERIC(startTime)" when using SDAVAssetExportSession

func sdvExport(_ videoURL: URL)
{
    let avAsset1 = AVURLAsset(url: videoURL, options: nil)
    print("AVURLAsset1 ", avAsset1)
    let avAsset = AVURLAsset(url: ((videoURL as NSURL) as URL), options: nil)
    print("AVURLAsset2 ", avAsset)
    let startDate = Foundation.Date()

    //Create Export session
   // exportSessionSDA = SDAVAssetExportSession(asset: avAsset, presetName: AVAssetExportPresetPassthrough)
    exportSessionSDA = SDAVAssetExportSession(asset: avAsset)

    // exportSession = AVAssetExportSession(asset: composition, presetName: mp4Quality)
    //Creating temp path to save the converted video


    let documentsDirectory = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
    let myDocumentPath = URL(fileURLWithPath: documentsDirectory).appendingPathComponent("temp.mp4").absoluteString
    let url = URL(fileURLWithPath: myDocumentPath)

    let documentsDirectory2 = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0] as URL

    let filePath = documentsDirectory2.appendingPathComponent("rendered-Video.mp4")
    deleteFile(filePath)

    //Check if the file already exists then remove the previous file
    if FileManager.default.fileExists(atPath: myDocumentPath) {
        do {
            try FileManager.default.removeItem(atPath: myDocumentPath)
        }
        catch let error {
            print(error)
        }
    }



    exportSessionSDA!.outputURL = filePath
    exportSessionSDA!.outputFileType = AVFileTypeMPEG4
    exportSessionSDA!.shouldOptimizeForNetworkUse = true

    exportSessionSDA.videoSettings = [AVVideoCodecKey: AVVideoCodecH264, AVVideoWidthKey: 800, AVVideoHeightKey: 600, AVVideoCompressionPropertiesKey: [AVVideoAverageBitRateKey: 6000000, AVVideoProfileLevelKey: AVVideoProfileLevelH264High40]]
    exportSessionSDA.audioSettings = [AVFormatIDKey: kAudioFormatMPEG4AAC, AVNumberOfChannelsKey: 2, AVSampleRateKey: 44100, AVEncoderBitRateKey: 128000]

    // NOTE: a timescale of 0 makes CMTimeMakeWithSeconds return an invalid CMTime,
    // which is exactly what trips the CMTIME_IS_NUMERIC(startTime) assertion;
    // use a nonzero timescale such as 600 instead.
    let start = CMTimeMakeWithSeconds(0.0, 600)
    let range = CMTimeRangeMake(start, avAsset.duration)
    exportSessionSDA.timeRange = range

    exportSessionSDA!.exportAsynchronously(completionHandler: {() -> Void in
        switch self.exportSessionSDA!.status {
        case .failed:
            print("%@",self.exportSessionSDA?.error)
        case .cancelled:
            print("Export canceled")
        case .completed:
            //Video conversion finished
            let endDate = Foundation.Date()

            let time = endDate.timeIntervalSince(startDate)
            print(time)
            print("Successful!")
            guard let data = NSData(contentsOf: (self.singleTonFile?.fileUrl)!) else {
                return
            }

            print("File size before compression: \(Double(data.length / 1048576)) mb")
            print("self.exportSessionSDA.outputURL = ", self.exportSessionSDA.outputURL)
            guard let compressedData = NSData(contentsOf: self.exportSessionSDA.outputURL!) else {
                return
            }

            print("File size after compression: \(Double(compressedData.length / 1048576)) mb")

            let when = DispatchTime.now() + 8

            DispatchQueue.main.asyncAfter(deadline: when)
            {
                print("video_duration", self.singleTon.video_duration)
                self.videoUrl() // calling upload
            }
            // self.mediaPath = self.exportSession.outputURL?.path as NSString!
            //  self.singleTon.mediaUrl = URL(self.exportSession.outputURL?.path)
            // print("self.singleTon.mediaUrl ",self.singleTon.mediaUrl)
            //self.mediaPath = String(self.exportSession.outputURL!)
        // self.mediaPath = self.mediaPath.substringFromIndex(7)
        default:
            break
        }

    })
}

SDAVAssetExportSession is hanging

Seems to hang about 30% of the time in:

- (BOOL)encodeReadySamplesFromOutput:(AVAssetReaderOutput *)output toInput:(AVAssetWriterInput *)input

at line 225:

CMSampleBufferRef sampleBuffer = [output copyNextSampleBuffer];

It'll just be stuck there and hang forever. The other odd thing is that the progress always seems to be at 98%.

No audio

encoder.audioSettings = @
{
AVFormatIDKey: @(kAudioFormatMPEG4AAC),
AVNumberOfChannelsKey: @2,
AVSampleRateKey: @44100,
AVEncoderBitRateKey: @128000,
};

The export contains no audio, while the same audioMix produces audio with the standard AVAssetExportSession.

Crash while export

Hello,
This project is great (y), but I've run into a few crashes in my application:

*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: '*** 
-[AVAssetWriterInput appendSampleBuffer:] Must start a session (using -[AVAssetWriter startSessionAtSourceTime:) before appending sample buffers'
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: '***
 -[AVAssetWriterInput markAsFinished] Cannot call method when status is 0'
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** 
-[AVAssetReader initWithAsset:error:] invalid parameter not satisfying: asset != ((void *)0)'

Config is:

    SDAVAssetExportSession *session = [[SDAVAssetExportSession alloc] initWithAsset:composition];
    session.videoComposition = videoMix;
    session.audioMix = audioMix;
    session.videoSettings = @{
        AVVideoCodecKey : AVVideoCodecH264,
        AVVideoWidthKey : @640,
        AVVideoHeightKey : @640,
        AVVideoCompressionPropertiesKey : @{
            AVVideoAverageBitRateKey : @2000000,
            AVVideoProfileLevelKey : AVVideoProfileLevelH264Baseline31,
        }
    };
    session.audioSettings = @{
        AVFormatIDKey : @(kAudioFormatMPEG4AAC),
        AVNumberOfChannelsKey : @2,
        AVSampleRateKey : @44100,
        AVEncoderBitRateKey : @64000,
    };

    session.outputURL = [[NSURL alloc] initFileURLWithPath:outPath];
    session.outputFileType = AVFileTypeMPEG4;
    session.shouldOptimizeForNetworkUse = YES;

Plus I'm using custom videoComposition and audioMix

They happen rarely and are hard to reproduce, but do you have any thoughts on why this happens? Any recommendations?

Doesn't work on iOS simulator anymore? (iOS 8.3)

Hello,

I'm trying to use SDAVAssetExportSession in the iOS Simulator, but it's not working any more. I'm calling encoder.exportAsynchronouslyWithCompletionHandler(), but the completion handler is never called. The same code works fine on my device (iPhone 4S running iOS 8).

Anyone else experiencing the same issue?

getting "AVAssetReaderOutput does not currently support compressed output" Exception

I am using Xcode 7 beta 5 and am getting a strange error with code that follows the README and a sample from Stack Overflow. I am getting:

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVAssetReaderVideoCompositionOutput initWithVideoTracks:videoSettings:] AVAssetReaderOutput does not currently support compressed output'

Anyone see this or have any idea what is going on? I tried playing with the options but no solution.

Thanks

Can you provide a demo?

This is my first time implementing video transcoding, and I haven't been able to get it working properly with your library. It has me confused.
Thank you.

License?

What source code license has this code been published under?

Create git tags

Please mark 9ade7f0 as 0.0.2, and probably the current master as 0.0.3. Then I could push your repository to CocoaPods trunk :)

Videos rotated -90 degrees scaled too much

I had a video with a rotation of -90 degrees; the preferredTransform is (a = 0, b = -1, c = 1, d = 0, tx = 0, ty = 720). The video was being scaled too much. Looking into the code, videoAngleInDegree in buildDefaultVideoComposition gets assigned a value between -180 and 180, which becomes -90 for my video. However, the following comparison checks for 90 or 270, so my video doesn't match, resulting in a video that is scaled incorrectly.

    CGAffineTransform transform = videoTrack.preferredTransform;
    CGFloat videoAngleInDegree  = atan2(transform.b, transform.a) * 180 / M_PI;
    if (videoAngleInDegree == 90 || videoAngleInDegree == 270) {
        CGFloat width = naturalSize.width;
        naturalSize.width = naturalSize.height;
        naturalSize.height = width;
    }

The problem in the above code seems to be that videoAngleInDegree can be -90 but never 270. It seems odd that this hasn't come up before, so I am probably missing something!

I've created a fork and will submit a pull request just in case. Thanks!

crash while setting the AVVideoCodecH264

* Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '* -[AVAssetWriterInput initWithMediaType:outputSettings:sourceFormatHint:] For compression property ProfileLevel, video codec type avc1 only allows the following values: H264_Baseline_1_3, H264_Baseline_3_0, H264_Baseline_3_1, H264_Baseline_4_1, H264_Main_3_0, H264_Main_3_1, H264_Main_3_2, H264_Main_4_0, H264_Main_4_1, H264_Main_5_0, H264_High_5_0, H264_Baseline_AutoLevel, H264_Main_AutoLevel, H264_High_AutoLevel'

I have tried all of the above values and it still crashes on iOS 6.

export fail with AVVideoProfileLevelH264Baseline30

Hi,
I tried to encode a video returned from UIImagePickerController, and the following doesn't work:

- (IBAction)takeVideo:(id)sender
{
    UIImagePickerController *ipc = [[UIImagePickerController alloc] init];
    NSArray *availableTypes = [UIImagePickerController
                               availableMediaTypesForSourceType:UIImagePickerControllerSourceTypeCamera];
    ipc.mediaTypes = availableTypes;
    ipc.sourceType = UIImagePickerControllerSourceTypeCamera;
    [ipc setVideoMaximumDuration:10];
    ipc.delegate = self;
    ipc.videoQuality = UIImagePickerControllerQualityTypeMedium;

    if ([availableTypes containsObject:(__bridge NSString *)kUTTypeMovie]) {
        [ipc setMediaTypes:@[(__bridge NSString *)kUTTypeMovie]];
    }
    [self presentViewController:ipc animated:YES completion:nil];
}

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    SDAVAssetExportSession *encoder = [SDAVAssetExportSession.alloc initWithAsset:[AVAsset assetWithURL:[info objectForKey:UIImagePickerControllerMediaURL]]];
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *myPathDocs =  [documentsDirectory stringByAppendingPathComponent: [NSString stringWithFormat:@"lowerBitRate-%d.mov",arc4random() % 1000]];

    NSURL *url = [NSURL fileURLWithPath:myPathDocs];
    encoder.outputFileType = AVFileTypeMPEG4;
    encoder.outputURL = url;
    encoder.videoSettings = @
    {
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoWidthKey: @1920,
        AVVideoHeightKey: @1080,
        AVVideoCompressionPropertiesKey: @
        {
            AVVideoAverageBitRateKey: @600000,
            AVVideoProfileLevelKey: AVVideoProfileLevelH264Baseline30,
        },
    };
    encoder.audioSettings = @
    {
        AVFormatIDKey: @(kAudioFormatMPEG4AAC),
        AVNumberOfChannelsKey: @2,
        AVSampleRateKey: @44100,
        AVEncoderBitRateKey: @128000,
    };

    [encoder exportAsynchronouslyWithCompletionHandler:^
     {
         int status = encoder.status;

         if (status == AVAssetExportSessionStatusCompleted)
         {
             AVAssetTrack *videoTrack = nil;
             AVURLAsset *asset = [AVAsset assetWithURL:encoder.outputURL];
             NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
             videoTrack = [videoTracks objectAtIndex:0];
             float frameRate = [videoTrack nominalFrameRate];
             float bps = [videoTrack estimatedDataRate];
             NSLog(@"Frame rate == %f",frameRate);
             NSLog(@"bps rate == %f",bps/(1024.0 * 1024.0));
             NSLog(@"Video export succeeded");
             // encoder.outputURL <- this is what you want!!
         }
         else if (status == AVAssetExportSessionStatusCancelled)
         {
             NSLog(@"Video export cancelled");
         }
         else
         {
             NSLog(@"Video export failed with error: %@", encoder.error);
         }
     }];
}

The error I get:
Video export failed with error: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x17427d100 {NSUnderlyingError=0x1740545b0 "The operation couldn’t be completed. (OSStatus error -12902.)", NSLocalizedFailureReason=An unknown error occurred (-12902), NSLocalizedDescription=The operation could not be completed}
