hap-in-avfoundation's Introduction

HapInAVFoundation Framework

Hap is a video codec for fast decompression on modern graphics hardware. This is the home of the HapInAVFoundation framework. For general information about Hap, see the Hap project.

The HapInAVFoundation framework supports encoding and decoding Hap video. Unlike the QuickTime codec, it currently decodes only to S3TC/DXT frames suitable for upload to graphics hardware; decoding to RGB(A) pixel formats can be added if requested. Encoding from RGB(A) frames is supported. For the most part, this is a port of the Hap QuickTime codec.

Sample code for a test application ("HapInAVF Test App") that demonstrates the use of this framework for accelerated playback is included.

Download

A compiled version of the framework can be downloaded from the Releases page.

Requires macOS 10.10 Yosemite or later.

Using the HapInAVFoundation.framework

The general idea is to either download or compile the framework, add it to your Xcode project so you can link against it, and then set up a build phase to copy the framework into your application bundle. This last step is important: most of the time when you link against a framework, the framework is expected to already be installed on the OS. HapInAVFoundation is different: your application includes its own compiled copy of the framework. Here's the exact procedure:

If you downloaded a compiled framework

  1. Unzip the framework, add it to your source tree, and then drag the framework into your Xcode project's workspace.
  2. Locate the "Build Phases" section for your project/application's target.
  3. Add the HapInAVFoundation framework to the "Link Binary with Libraries" section.
  4. Create a new "Copy Files" build phase, set its destination to the "Frameworks" folder, and add the HapInAVFoundation framework to this build phase; the goal is to copy the framework into the "Frameworks" folder inside your app package.
  5. Switch to the "Build Settings" section of your project's target, locate the "Runpath Search Paths" setting, and make sure that the following paths exist: "@loader_path/../Frameworks" and "@executable_path/../Frameworks".

If you're compiling from source

  1. In Xcode, close the HapInAVFoundation project (if it is open), and then open your project.
  2. In the Finder, drag "HapInAVFoundation.xcodeproj" into your project's workspace.
  3. Switch back to Xcode and locate the "Build Phases" section for your project/application's target.
  4. Add a dependency on the "HapInAVFoundation" framework. This ensures the framework gets compiled before your project, so there won't be any missing dependencies.
  5. Add the HapInAVFoundation framework to the "Link Binary with Libraries" section.
  6. Create a new "Copy Files" build phase, set its destination to the "Frameworks" folder, and add the HapInAVFoundation framework to this build phase; the goal is to copy the framework into the "Frameworks" folder inside your app package.
  7. Switch to the "Build Settings" section of your project's target, locate the "Runpath Search Paths" setting, and make sure that the following paths exist: "@loader_path/../Frameworks" and "@executable_path/../Frameworks".
  8. That's it. You can now import/include headers from the framework in your source just as you normally would (see the example below).
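
For instance, assuming the framework's umbrella header follows the usual naming convention (the header name isn't shown on this page, so treat it as an assumption):

    // import the framework's umbrella header (name assumed from the framework name)
    #import <HapInAVFoundation/HapInAVFoundation.h>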

Documentation

Documentation for this framework can be found here. The header files are also commented extensively, and the included sample application demonstrates the use of the framework to play back Hap video to GL textures.
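
To give a rough sense of how the pieces fit together, here is a minimal playback sketch assembled from the class and method names that appear in the sample app and the issues below; anything beyond those names (the movieURL variable, attaching the output to a player item, the texture upload) is an assumption, so treat this as orientation rather than reference code:

    // sketch only: consult the sample app ("HapInAVF Test App") for the real setup
    #import <HapInAVFoundation/HapInAVFoundation.h>
    #import <mach/mach_time.h>

    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:movieURL options:nil];    // movieURL is assumed
    AVPlayerItemHapDXTOutput *hapOutput =
        [[AVPlayerItemHapDXTOutput alloc] initWithHapAssetTrack:asset.hapVideoTracks.firstObject];

    // attach hapOutput to your AVPlayerItem and begin playback, then once per display refresh:
    HapDecoderFrame *dxtFrame =
        [hapOutput allocFrameClosestToTime:[hapOutput itemTimeForMachAbsoluteTime:mach_absolute_time()]];
    if (dxtFrame != nil) {
        // upload the frame's DXT data to a texture (the sample app's HapPixelBufferTexture does this)
    }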

Open-Source

The Hap codec project is open-source, licensed under a FreeBSD License, meaning you can use it in your commercial or non-commercial applications free of charge. The Hap project was originally written by Tom Butterworth and commissioned by VIDVOX in 2012.

hap-in-avfoundation's People

Contributors

anome, bangnoise, hiddedejong, mrray, mto-anomes, pixlwave

hap-in-avfoundation's Issues

Single frame movie fails due to lastEncodedDuration not having a value

If you try to encode a movie with just one frame, it fails because lastEncodedDuration doesn't have a meaningful value. I propose that if there's only one buffer when the writer is marked as finished, 1 unit of the buffer's time scale is used as the duration. I've implemented this in my own private fork, but it hasn't been rigorously tested outside my use case.
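
For illustration only, a hedged sketch of the kind of fallback being proposed; the variable name lastSampleBuffer is hypothetical and this is not the fork's actual patch:

    // if only one sample buffer was ever appended, fall back to a duration of
    // one unit of that buffer's timescale (sketch, not the actual fix)
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(lastSampleBuffer);
    CMTime fallbackDuration = CMTimeMake(1, pts.timescale);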

File information

Hi. I'm new to this, but I don't understand why, after I export to Hap, my file (which plays fine, I think) doesn't show any information when I do Command-I. Usually the file size and codec are shown there, and they were there in the previous file I exported from. I need this information and I don't know how else to get it. Is there a setting I'm missing?

Framework not working with macOS 10.13

The framework doesn't work at all on 10.13, or sometimes it half-works but appears to play random frames. It doesn't matter which SDK I use for building. When running the test app on 10.13 I get this error message in a loop:

HapInAVF Test App[5620:80736] err -12703 at CMBlockBufferGetDataPointer() in -[AVPlayerItemHapDXTOutput _decodeHapDecoderFrame:]

I'll have a look asap

Cannot run on OSX 10.11 (see 12-12-2019 commit)

Hi,

Because OS X 10.11 does not have os_unfair_lock, Hap decoding crashes.
If I replace it with the old OSSpinLock, it works again.

This started with commit 4226a51, so a partial revert is the solution.
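
For reference, a sketch of one possible compatibility shim, assuming the deployment target stays below 10.12 so the runtime check is meaningful; this is neither the project's current code nor the fork's actual patch:

    #import <os/lock.h>
    #import <libkern/OSAtomic.h>

    // use os_unfair_lock where it exists, fall back to the (deprecated) OSSpinLock on 10.11
    static os_unfair_lock unfairLock = OS_UNFAIR_LOCK_INIT;
    static OSSpinLock spinLock = OS_SPINLOCK_INIT;

    static void decoderLock(void) {
        if (__builtin_available(macOS 10.12, *))
            os_unfair_lock_lock(&unfairLock);
        else
            OSSpinLockLock(&spinLock);
    }
    static void decoderUnlock(void) {
        if (__builtin_available(macOS 10.12, *))
            os_unfair_lock_unlock(&unfairLock);
        else
            OSSpinLockUnlock(&spinLock);
    }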

Should I submit my pull request? Or is os_unfair_lock a must-have, in which case OS X 10.11 support should be deprecated?

Thank you. Philippe

Add support for Metal Graphics and Compute API

New software built on macOS uses the Metal Graphics API. Currently the example app utilizes OpenGL, which is deprecated on macOS. It would be great to look into supporting Metal if possible!

HapPixelBufferTexture.m

Hi

This is probably a non-issue, but I was just wondering if there's a reason why HapPixelBufferTexture included in the test app isn't a part of the main framework? Feel free to close this issue if I'm talking nonsense, I'm only suggesting this as I had to include it along with HapInAVFoundation.framework to implement simple playback support for HAP.

Many thanks

Warnings with snappy

I'm getting a bunch of warnings that are also triggering breakpoints in my app. Should this be happening?

I'm importing the framework into Swift on the latest Xcode.

Settings bug: Strip audio resets codec selection

This seems to be a small UI bug. Confirmed in versions 1.5.3 and 1.6.9.1.

To reproduce:

  1. Click Load saved settings and select Hap
  2. Click Settings...
  3. Choose Strip audio tracks from files
  4. Click Close Settings

Upon encoding any file, the Hap codec is not used. The resulting file size is huge, and the output codec (shown in the attached screenshot) is not Hap.

I randomly managed to get the correct codec to apply by clicking around the interface for a bit. I can't seem to reproduce this "workaround" again, but it went something like this:

  1. Click Load saved settings and select Hap
  2. Click Settings...
  3. Choose Strip audio tracks from files
  4. Select another codec, e.g. H264
  5. Select Hap again
  6. Click Close Settings

This leads me to believe that the bug is somehow tied to the UI. Please let me know if I can provide any further information.

Crash when opening a non-hap Video

Hi. First of all, thanks for all the great work on this library; Hap has been super helpful to me.

I have an app that plays video and have been using HapInAVFoundation since it came out; it works very well. But recently I updated to the latest version of the library and started to get a crash when playing non-Hap videos, specifically H.264 (I also tried Photo JPEG).

Then I tried the provided example, "HapInAVF Test App", and the same thing happens.
The line of code (line 210) is this one:

        HapDecoderFrame         *dxtFrame = [hapOutput allocFrameClosestToTime:[hapOutput itemTimeForMachAbsoluteTime:mach_absolute_time()]];

It used to work great: if you didn't have a Hap video, that call returned nil and you could fall back to normal decoding with regular AVFoundation. Now it crashes there.

A screenshot of the crash is attached.

Sometimes the crash is a little more specific: it happens at the dispatch_async(decodeQueue, ...) call in AVPlayerItemHapDXTOutput.m:

    if (!foundExactMatchToTarget)   {
        //  now use GCD to start decoding the frame
        dispatch_async(decodeQueue, ^{
            [self _decodeFrameForTime:n];
        });
    }

Right now a possible workaround would be to use the codec string to know whether you are playing a Hap video; if so, that call works fine, and otherwise you can proceed with regular AVFoundation.

However, I liked the other approach better, as it seemed cleaner to me. If there is anything I can do to help, please let me know. Thank you in advance.
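
A hedged sketch of that workaround, using the hapVideoTracks property that appears elsewhere in these issues (movieURL and the fallback path are placeholders):

    // only go down the Hap path when the asset actually contains a Hap video track
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:movieURL options:nil];
    if ([asset hapVideoTracks].count > 0) {
        AVPlayerItemHapDXTOutput *hapOutput =
            [[AVPlayerItemHapDXTOutput alloc] initWithHapAssetTrack:asset.hapVideoTracks.firstObject];
        // proceed with Hap DXT playback as usual
    } else {
        // set up a normal AVFoundation playback pipeline instead
    }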

Rendering issue on Intel cards

On Intel cards the decoding of Hap frames goes wrong; see the attached picture.

To fix it, glFlush() needs to be called right after glCompressedTexSubImage2D() in HapPixelBufferTexture's setDecodedFrame: method.
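
Schematically, the change would look something like this; the texture target and argument names are placeholders, not the actual values used by HapPixelBufferTexture:

    // inside -[HapPixelBufferTexture setDecodedFrame:], after uploading the DXT data
    glCompressedTexSubImage2D(target, 0, 0, 0, roundedWidth, roundedHeight,
                              internalFormat, (GLsizei)dxtDataLength, dxtBaseAddress);
    glFlush();    // proposed addition: flush the upload so Intel GPUs don't sample stale data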

BTW, this needs to be added to the hap-quicktime-playback-demo project too.

AVF Batch Exporter crash on Catalina in certain situation (and proposed fix)

We recommend that our (EboSuite) users use the AVF Batch Exporter, and some of them had crashes while converting (as I understood it) all the files they tried after upgrading to Catalina.

This was the error mentioned in the crash log:

Crashed Thread:        1  Dispatch queue: VVAVFTranscoder

Exception Type:        EXC_CRASH (SIGABRT)
Exception Codes:       0x0000000000000000, 0x0000000000000000
Exception Note:        EXC_CORPSE_NOTIFY

Application Specific Information:
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVAssetWriterInput appendSampleBuffer:] Cannot append sample buffer: First input buffer must have an appropriate kCMSampleBufferAttachmentKey_TrimDurationAtStart since the codec has encoder delay'
abort() called
terminating with uncaught exception of type NSException

The odd thing is that we initially couldn't reproduce it on Mojave, nor on the first Catalina machine we tried, but later could on another machine (always testing with the same file that one user provided). I don't understand how this could happen.

I managed to fix this crash by adding a few lines in VVAVFTranscoder.m: I check whether the first buffer returned for each track has the TrimDurationAtStart attachment and, if not, I add it (using the asset's duration as the time). This fixes it.
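
For context, a hedged sketch of what such a fix could look like using the Core Media attachment APIs; this is not the actual patch, and the kCMTimeZero value here is a placeholder (the reporter's fix uses the asset's duration):

    // if the first sample buffer of a track carries no trim-duration attachment, add one
    if (NULL == CMGetAttachment(sampleBuffer, kCMSampleBufferAttachmentKey_TrimDurationAtStart, NULL)) {
        CFDictionaryRef trimDict = CMTimeCopyAsDictionary(kCMTimeZero, kCFAllocatorDefault);
        CMSetAttachment(sampleBuffer,
            kCMSampleBufferAttachmentKey_TrimDurationAtStart,
            trimDict,
            kCMAttachmentMode_ShouldPropagate);
        CFRelease(trimDict);
    }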

Crash when activating OutputAsRGB

The following file makes VDMX, Millumin, QLab, and others crash. The issue seems to happen when generating the thumbnail, i.e. when using outputAsRGB.
Crash when activating OutputAsRGB.mov.zip
(for info, this file was generated with Syphon Recorder)

Small program to reproduce the issue:

        for(int i=0; i<100; i++)
        {
            NSURL *url = [[NSURL alloc] initFileURLWithPath:@"video.mov"];
            AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
            AVPlayerItemHapDXTOutput *hapOutput = [[AVPlayerItemHapDXTOutput alloc] initWithHapAssetTrack:asset.hapVideoTracks.firstObject];
            hapOutput.outputAsRGB = YES;
            HapDecoderFrame *dxtFrame = [hapOutput allocFrameForTime:kCMTimeZero]; // it will crash there at some point
            NSLog(@"=> %@", dxtFrame);
            [NSThread sleepForTimeInterval:1];
        }

[feature] Throw warning when pixel format is not divisible by 4

Plenty of Hap decoder implementations, like Christie Pandoras Box and Dataton WatchOut, are picky about pixel dimensions that are not divisible by 4.

It would be helpful if the encoder could throw a warning when the input file doesn't comply with this. Right now I don't have a reference where this is actually specified, so I'm also unsure why some decoders fail on this and some don't. But as a fact, Pandoras Box throws a critical render-engine error for non-compliant files and won't render them; WatchOut at least reports an error as well, though I'm not sure whether it still tries to render.
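
A minimal sketch of the kind of check being requested; the helper and the trackWidth/trackHeight variables are hypothetical, not part of the encoder today:

    // DXT/S3TC compresses 4x4 blocks, so warn when either dimension isn't a multiple of 4
    static BOOL HapDimensionsAreBlockAligned(size_t width, size_t height)    {
        return (width % 4 == 0) && (height % 4 == 0);
    }

    // e.g. before encoding:
    if (!HapDimensionsAreBlockAligned(trackWidth, trackHeight))    {
        NSLog(@"warning: %zu x %zu is not divisible by 4; some Hap decoders may reject this file",
            trackWidth, trackHeight);
    }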

BC7-compatible encoders

To the best of my knowledge, libsquish doesn't support BC7 (nor do any of the forks I've looked at so far; somebody please correct me if I'm mistaken). Support for the BC7 variant of Hap will require a different texture compression library (potentially replacing libsquish if it supports BC1/4/5, if the metrics are good, and if it would result in a simplified codebase). This issue is meant to provide a place for assembling a list of all potential encoders for consideration.

requirements:

  • support for enc/decoding of BPTC/BC7 UNORM textures
  • bonus: support for enc/decoding of DXT1/BC1, DXT5/BC3, and Alpha RGTC1/BC4

Any interest for an Apple SceneKit integration ?

I got Hap running in SceneKit through a custom SCNNode class.
Any interest in adding a sample project for that? I can do a pull request, or keep a separate repo altogether.
I've only implemented OpenGL, not Metal, and right now I'm bypassing SceneKit's SCNGeometry and SCNMaterial, but it can still be relevant/useful (it is to me as is). Maybe we can start from here and refactor the code to use an SCNShadable as the source for any SCNMaterial property (which would be the proper way to do it), but we would have to use the Core profile only (not the hybrid legacy GL you're using right now).

Default Xcode Project Does Not Compile under Xcode 9.4.1

Attempting to compile the Xcode project in Xcode 9.4.1 fails: the header files for squish and snappy are not found because they are not copied to $(BUILT_PRODUCTS_DIR).

#include <squish.h> (in squish-c.cpp) -- FAIL
#include <snappy-c.h> (in hap.c) -- FAIL

The following changes to the include paths allow the project to compile.

change
"$(BUILT_PRODUCTS_DIR)/include/snappy"
to
"$(PROJECT_DIR)/external/snappy/snappy-mac/build/$(CONFIGURATION)/include/snappy"

change
"$(BUILT_PRODUCTS_DIR)/include/squish"
to
"$(PROJECT_DIR)/external/squish/build/$(CONFIGURATION)/include/squish"

Am I missing something? Does this project require a newer version of Xcode?

Batch converter doesn't work on still images

Howdy,

Sorry if I'm doing this incorrectly; I'm new here. The old VVBatchConverter would turn still images (JPEG etc) into Hap files, but I'm unable to import any stills into the AVFBatchConverter. I'm assuming this is intended, but is there any chance that functionality might return?

Thanks for your help!

Text colour in Mojave Dark Mode

A small issue, but some of the text fields have their text colour set to black, which makes it more or less invisible when building in Xcode 10 and running on Mojave with Dark Mode enabled.

I think Xcode provides named NSColors that behave correctly in either mode; maybe use one of those.
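
For example, one of the semantic colors that adapts automatically to Dark Mode (a sketch, assuming the affected controls are plain NSTextFields):

    // use a semantic color instead of hard-coded black so the text stays legible in Dark Mode
    someTextField.textColor = [NSColor labelColor];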

Apple M1 support (wip)

Hello,

I've been testing support for M1 recently.
I was able to compile and use a universal framework (Intel + M1) via the "sse2neon" project, which translates Intel SSE instructions into Arm NEON ones. Nothing really fancy; I just replaced the SSE includes with code like this:

#ifdef __x86_64__
    #include <xmmintrin.h>
#else
    #include "sse2neon.h"
#endif

This way I got an application that decodes Hap (all flavors) on Intel and M1 machines.
Unfortunately, performance on M1 machines is poor compared to Intel ones: I can barely play 2 x 4K@60 Hap movies without dropping frames. Performance is quite similar whether I use OpenGL, Metal or even Rosetta, and the CPU tends to be busier than expected.
I have no problem playing multiple 4K@60 H.264 movies, so I think the M1 machines are powerful enough and should be able to play at least 4 or 5 x 4K@60 Hap movies.

I'm wondering whether "sse2neon" is efficient in this case, or whether, since ARM chips have fewer instructions than x86 ones (from what I understand), some missing instructions make Hap a poorer fit for Apple M1.

What do you think? Thank you.

AVF Batch Converter fails to copy video tracks

In the AVF Batch Converter app, if you choose "Don't Recompress Video" and then select "Copy video tracks from files", you end up with the resulting file having no video track, and it is automatically trashed due to an error in the process.

I haven't really dug in and spent a ton of time finding an elegant fix, but I have a solution that seems to work. At issue is that in VVAVFTranscoder.m, in transcodeFileAtPath, the variable 'imgBufferIsFineOrIrrelevant' is set to NO if 'tmpImgBuffer' is NULL. I'm assuming there's no 'tmpImgBuffer' when copying a video track, so with 'imgBufferIsFineOrIrrelevant' set to NO it doesn't copy the track, throws an error, and the resulting output file is automatically trashed.

I have added a relatively crude fix that checks 'transcodeThisTrack' in addition to checking whether 'tmpImgBuffer' is NULL. If the track isn't being transcoded (i.e. it is being copied), then 'imgBufferIsFineOrIrrelevant' remains YES and the process succeeds with no error (see the sketch below). This requires adding a local variable for 'transcodeThisTrack'. N.B. I haven't done exhaustive testing on this, but so far it seems OK.
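
A hedged sketch of the described change; the variable names follow the issue text, the surrounding logic of VVAVFTranscoder.m is omitted, and the exact placement is an assumption:

    // only treat a missing image buffer as a problem when the track is actually being
    // transcoded; passthrough ("copy") tracks never produce a tmpImgBuffer
    BOOL imgBufferIsFineOrIrrelevant = YES;
    if (tmpImgBuffer == NULL && transcodeThisTrack)    {
        imgBufferIsFineOrIrrelevant = NO;
    }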

Xcode 8 and quicktime.h

Similar to mrRay/vvopensource#23, this framework throws up the following error when compiling with Xcode 8:
Utility.c:29:10: 'QuickTime/QuickTime.h' file not found
Hopefully, since this framework uses AVFoundation instead of QuickTime, it shouldn't be a major issue to fix?

Thanks :)
