
js_mse_eme's People

Contributors

abakirov, austin226, calvaris, dbarattag, gmierz, jeljeli, jiaqzhao, kirbysayshi, tdedecko, whimboo, ykristos


js_mse_eme's Issues

Turn EME test 6 into audio only

On some embedded platforms, test number 6 creates a bunch of video sessions, which can exhaust video memory because of hardware limitations. I think the spirit of the test would be preserved if we switched it to audio instead of video: you could still create several sessions without involving video memory.

Determining supportsVideoPerformanceMetrics issue

This code should determine whether the browser supports this type of metrics. The if statement returns the wrong value when both getVideoPlaybackQuality and the webkit counters (webkitDecodedFrameCount, webkitDroppedFrameCount) are supported; when both are present it should also return true.

I'm currently trying to run these tests with a browser that supports both, and I can't because of this.
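A minimal sketch of the corrected predicate (the function name and structure here are hypothetical; the real check lives in the harness):

```javascript
// Hypothetical sketch of the corrected capability check: either the standard
// getVideoPlaybackQuality() API or the legacy webkit frame counters is enough,
// and having BOTH must also return true (the reported bug returned false then).
function supportsVideoPerformanceMetrics(video) {
  var hasStandardApi = typeof video.getVideoPlaybackQuality === 'function';
  var hasWebkitCounters = 'webkitDecodedFrameCount' in video &&
      'webkitDroppedFrameCount' in video;
  return hasStandardApi || hasWebkitCounters;
}
```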

ClearKeyAudio and Video EME non-WebM tests are broken because of missing encryption inside the files

The files used by the non-WebM versions of the ClearKeyAudio and Video tests are not properly encrypted with ClearKey, hence it is impossible to pass the tests.

VideoStreamYTCenc ('media/oops_cenc-20121114-145-no-clear-start.mp4') yields these system IDs:
edef8ba9-79d6-4ace-a3c8-27dcd51d21ed Widevine in pssh box
9a04f079-9840-4286-ab92-e65be0885f95 PlayReady in pssh box
58147ec8-0423-4659-92e6-f52c5ce8c3cc ClearKey in pssh box

Of course this works because the file is encrypted with ClearKey; we have the key and we can play. This works for test 1 and test 4.

In the case of test 3, which uses VideoNormalClearKey ('media/car_cenc-20120827-86.mp4'), we are getting:
9a04f079-9840-4286-ab92-e65be0885f95 PlayReady in pssh box
edef8ba9-79d6-4ace-a3c8-27dcd51d21ed Widevine in pssh box
9a04f079-9840-4286-ab92-e65be0885f95 PlayReady in piff box
but no ClearKey UUID, so the file cannot be decrypted.

The same happens for test 2 with AudioNormalClearKey ('media/car_cenc-20120827-8c.mp4').

So some of the mp4 files defined in streamDef.js are wrongly encrypted, which makes them impossible to decrypt.

It would be worth having a look at streamDef.js and checking how the files referenced there are encrypted.

Besides this, I think the ClearKey UUID changed in the spec and is now different from 58147ec8-0423-4659-92e6-f52c5ce8c3cc.
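For reference, a sketch of the relevant CENC system IDs as I understand them. The COMMON_PSSH value below is the current W3C "Common PSSH box" registration (used for ClearKey); treating 58147ec8-… as an older provisional ClearKey ID is my own reading of this report:

```javascript
// Protection system IDs seen in this thread. COMMON_PSSH is the current W3C
// Common PSSH box registration; CLEARKEY_LEGACY appears to be an older
// provisional ClearKey ID (assumption based on this report).
const SYSTEM_IDS = {
  WIDEVINE: 'edef8ba9-79d6-4ace-a3c8-27dcd51d21ed',
  PLAYREADY: '9a04f079-9840-4286-ab92-e65be0885f95',
  CLEARKEY_LEGACY: '58147ec8-0423-4659-92e6-f52c5ce8c3cc',
  COMMON_PSSH: '1077efec-c0b2-4d02-ace3-3c1e52e2fb4b',
};
```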

Detailed readme

Where can I find a detailed readme with instructions for:

  1. downloading the complete repo + submodules
  2. execution steps
  3. the expected results

Better test filtering for the 2020 ytp tests

We need a way to filter YouTube playback subtests by type. Some devices can't run specific codecs, so we need to filter those out.

For example, we would like to run all HFR tests excluding VP9.

Audio Context is not closed after test fails/succeeds

We're trying to run the MSE Conformance tests in a loop. We've noticed that the audio context in some test cases is not closed properly, which slowly consumes more of the device's resources.

This method should close the context once it is done.

Calling ctx.close(); after runner.succeed() should do the trick.
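A minimal sketch of the suggested cleanup, assuming the harness's `runner` object and a Web Audio `ctx` (names from the report; the wrapper function itself is hypothetical):

```javascript
// Hypothetical helper: report the result first (as the report suggests), then
// close the AudioContext so looped runs don't accumulate audio resources.
function succeedAndCleanup(runner, ctx) {
  runner.succeed();
  if (ctx && typeof ctx.close === 'function') {
    ctx.close();  // releases the audio resources held by this context
  }
}
```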

Race condition in MediaSourceDuration tests

MediaSourceDuration tests first append a 1 MB chunk of audio and then, without waiting for the append to complete, call appendVideo():

    var audioXhr = runner.XHRManager.createRequest(audioStream.src,
        function(e) {
      var audioContent = audioXhr.getResponseData();
      audioSb.appendBuffer(audioContent);
      appendVideo();
    });

appendVideo() then appends the video initialization segment and 10 seconds worth of video. Then calls setDuration().

    var appendVideo = function() {
      runner.assert(isNaN(media.duration), 'Initial media duration not NaN');
      media.play();
      appendInit(media, videoSb, videoChain, 0, function() {
        appendUntil(runner.timeouts, media, videoSb, videoChain, 10,
            function() {
          setDuration(5, ms, [videoSb, audioSb], function() {

But this is wrong: at that point you still have no guarantee that the audio append has finished successfully, and changing the duration of a MediaSource requires that none of its SourceBuffers is still updating. Losing the race may seem unlikely, since the video path is racing against an XHR download and two other appends, but I have seen it happen frequently. It was causing InvalidStateErrors to be thrown.

I could confirm this by adding a log line:

        appendUntil(runner.timeouts, media, videoSb, videoChain, 10,
          function() {
            self.log(`appended first 10s of video, updatingVideo=${videoSb.updating} updatingAudio=${audioSb.updating}`);

Indeed, updatingAudio was still true, and soon after, the duration change failed. After modifying the test to call appendVideo() only after the audio append has finished, the problem was fixed.
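The fix described above can be sketched as a small helper that defers the next step until the SourceBuffer fires 'updateend' (the helper name is hypothetical):

```javascript
// Hypothetical helper: append data and invoke the callback only after the
// SourceBuffer has finished updating, avoiding the duration-change race.
function appendThen(sb, data, onDone) {
  sb.addEventListener('updateend', function handler() {
    sb.removeEventListener('updateend', handler);
    onDone();
  });
  sb.appendBuffer(data);
}

// Usage in the test would then be:
//   appendThen(audioSb, audioContent, appendVideo);
```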

AVSync testcases are broken

I know this repository is dead (has it been made private?) but maybe someone still reads it.

On the latest version (v 20240224), AVSync testcases (MSE core 36 and 37) are currently broken:
Uncaught ReferenceError: HALF_AU_DURATION_IN_SECONDS is not defined
onupdateend https://ytlr-cert.appspot.com/latest/yts.js:10541

The issue seems obvious: HALF_AU_DURATION_IN_SECONDS was renamed to QUARTERED_AU_DURATION_IN_SECONDS, but the old name is still used in some places.

How to run the downloaded checkout?

I downloaded the test suite using the download link on the EME test page, but when I run a test, I get XHR length assertion failures:

[Error] Assert: XHR length is (2884572) which should be (32768)
assert (logger-20180418143048.js:31)
check (logger-20180418143048.js:41)
checkEq (logger-20180418143048.js:49)
getResponseData (xhr-20180418143048.js:91)
(anonymous function) (msutil-20180418143048.js:733)

I'm serving the checkout using python3's http.server module. Are there further configuration options I need to set?
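One plausible cause (my assumption, not confirmed by the maintainers): the harness fetches media in byte ranges, and Python's `http.server` ignores the `Range` header and returns the whole file, which would explain receiving 2884572 bytes where 32768 were expected. A static server would need to slice responses roughly like this (hypothetical sketch):

```javascript
// Hypothetical sketch of the Range-header handling a static server needs.
// Parses "bytes=START-END" against a known total size; returns null when the
// header is absent or malformed (meaning: serve the whole file instead).
function parseByteRange(rangeHeader, totalSize) {
  const m = /^bytes=(\d+)-(\d*)$/.exec(rangeHeader || '');
  if (!m) return null;
  const start = Number(m[1]);
  const end = m[2] === '' ? totalSize - 1 : Math.min(Number(m[2]), totalSize - 1);
  if (start > end) return null;
  return { start: start, end: end, length: end - start + 1 };
}
```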

Duration shrinking involving buffered range removal isn't allowed anymore by the MSE spec

The MSE spec has recently undergone some changes. At some point[1] between the 20160503 Candidate Recommendation (CR) version[2] and the 20160705 CR version[3], the duration change algorithm started to have different rules regarding the management of duration cropping.

Some 2016 and 2017 tests such as "23/24. DurationAfterAppendAudio/Video"[4], "27. MediaElementEvents"[5] and "42. MediaSourceDuration"[6] shrink the duration at some point. According to the old spec, that should automatically delete the buffered ranges beyond the crop point. However, according to the new spec, that action should cause a runtime error.

Now that the MSE spec has reached the Recommendation level[7], is there any plan to update the YouTube Conformance Tests to adhere to the latest version of the spec regarding the mentioned issues?

[1] w3c/media-source@b386c1c
[2] https://www.w3.org/TR/2016/CR-media-source-20160503/#duration-change-algorithm
[3] https://www.w3.org/TR/2016/CR-media-source-20160705/#duration-change-algorithm
[4] https://github.com/youtube/js_mse_eme/blob/master/js/tests/2017/conformanceTest.js#L759
[5] https://github.com/youtube/js_mse_eme/blob/master/js/tests/2017/conformanceTest.js#L282
[6] https://github.com/youtube/js_mse_eme/blob/master/js/tests/2017/conformanceTest.js#L1218
[7] https://www.w3.org/TR/media-source/

Abnormal behavior in 2016 EME Conformance tests

(v 20161104093450)
Hi,
I tested EME for the YouTube certificate, but I see abnormal behavior.
Whenever I press Enter on a single test, RunAll starts instead; maybe there are some bugs in the 11/4 version.
So I want to re-run only the tests that failed, but I can't.

The MSE test behaves the same way.

Please look into this issue.

Thanks.

Question regarding car-audio-1MB-trunc.mp4

car-audio-1MB-trunc.mp4 is 1048576 bytes.
The last mdat box at offset 955849 claims its size is 157253, but there are only 92727 bytes left to read.
I assume this is intentional, given the "trunc" in the file name, but what is the expected behavior of the browser? That part seems unclear.
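The reported numbers are internally consistent; a quick sanity computation:

```javascript
// Sanity check of the reported numbers: the declared mdat payload overruns
// the end of the file by 64526 bytes, i.e. the box is deliberately truncated.
const fileSize = 1048576;      // total size of car-audio-1MB-trunc.mp4
const mdatOffset = 955849;     // where the last mdat box starts
const declaredSize = 157253;   // size field of that mdat box
const bytesAvailable = fileSize - mdatOffset;        // 92727
const missingBytes = declaredSize - bytesAvailable;  // 64526
```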

car-20120827-86.mp4 has buggy tfdt

The second fragment of car-20120827-86.mp4 has its baseMediaDecodeTime in tfdt set to 450450 units (at a timescale of 90000, 5.005s), but the first fragment is actually 450480 units* long (at a timescale of 90000, about 5.005333333s).

* Calculated by multiplying the default_sample_duration by sample_count (all frames use the default duration in trun).

In consequence, the last frame of the first fragment and the first frame of the second fragment have a small yet numerically significant overlap.

In implementations that do not implement frame slicing, when the first fragment is added after the second fragment this will trigger the deletion of the first frame of the second fragment and any frame depending on it. This is the reason 67. AppendH264VideoOutOfOrder fails e.g. on Chrome when run with ?novp9=true.

This muxing error also introduces other subtle problems:

  • Since third and following fragments also have the same problem, the track will have a small drift that increases each fragment against any properly muxed track. This has the potential to cause small audio sync errors in long videos.
  • Some implementations may ignore the tfdt for sequential fragments. In consequence, the same frame in the middle of the file will have different timestamps depending on whether it was loaded in a random-access manner or sequentially after all the other previous frames.
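The per-fragment overlap and its accumulation can be checked with quick arithmetic (the 1-hour projection is my own illustrative extrapolation, assuming every fragment boundary drifts equally):

```javascript
// Overlap per fragment boundary, using the numbers from the report.
const timescale = 90000;
const fragmentDuration = 450480;   // actual length of fragment 1, in units
const nextTfdt = 450450;           // baseMediaDecodeTime of fragment 2
const overlapUnits = fragmentDuration - nextTfdt;   // 30 units
const overlapSeconds = overlapUnits / timescale;    // ~0.000333 s

// Illustrative extrapolation: a 1-hour video at ~5.005 s per fragment has
// about 719 fragment boundaries, accumulating roughly a quarter second.
const driftPerHour = Math.floor(3600 / 5.005) * overlapSeconds;
```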

My questions are:

  • Does the actual YouTube service serve files affected by this issue?
  • Are implementations expected to deal with it (e.g. by implementing frame slicing) for certification?

Move HFR media files of Playback Performance tests into their own test group

A couple of the 60fps media files, especially those for testing the playback rate, are already listed under the HFR Playback Rate Performance test group, but some are still part of the Playback Performance test group.

It would be great to always have the 60fps media files under their own section. Could the remaining ones from Playback Performance be moved to HFR Playback Performance? Having all 60fps media files available at 1x playback rate would also be fantastic.

CC'ing @PhilHarnish with whom we discussed the usage of those tests in our CI for Firefox development.

The dot from the test names has been removed

Hello,

Is there a plan to add the dot back to test names?
On the 2019 branch there was something like: PlaybackPerf.H264.1080p30@2X
On 2020: PlaybackPerfH2641080p30@2X

Now it's harder to read.

Thank you!

Incorrectly added BMFF for PlayReady tests and 500 license error

Answering #17 (comment) :

The initData should have the keys as expected for clearKey regardless if we need them or not. This points to a problem with the initData the UA is providing. I suspect that you are using a different EME version other than 0.1b. That would explain why your initData is not the expected initData.

The story begins with me fixing a problem with the PlayReadyH264Video EME test, as we were selecting the wrong decryptor. That part is irrelevant for now: when I encountered this problem I stashed that code and just forced selection of the PlayReady decryptor, at which point I hit the JS issue and later the 500. Let's set it aside for now.

I am using v0.1b here, I am not building anything else.

How is the initData going to have the keys expected for ClearKey when we are providing PlayReady initData? FTR: the initData we are providing to the PlayReady engine, the generated key request and the server answer can be found at http://pastebin.com/neLhSDr4 . Of course this is after applying the fix I provided at #17; otherwise I would get no generateKeyRequest call.

I also was not able to reproduce the 500s on the license server. Can you provide more information with how you are running the tests or if you have made any modifications.

You have the info about the 500 on the pastebin link above.

I am running the following link for the tests http://mylocalipaddress/mstest/2016.html?enablewebm=false&command=run&test_type=encryptedmedia-test&tests=16 which contains changes at http://pastebin.com/w5tSPvsm . As you can see it is only the fix and some useful debugging (part of it was submitted, some was rejected and some accepted and reverted later, but I still find it useful).

I would be happy to look into this further but I need some more information on the failure. A complete stacktrace would be helpful, information on the UA EME version, and information about the 500 on the license server.

You already have the info about the 500. For the UA EME version: we run WebKitForWayland on a Raspberry Pi 2. I'm not sure what you mean by stacktrace; a JS stacktrace is difficult to get on an embedded device, but I can tell you the flow: we get the onneedkey event handler, then generateKeyRequest is called, which triggers the onkeymessage handler, which calls licenseManager.acquireLicense, which ends up calling licenseManager.requestLicense, which returns the answer I described, and finally the callback calls video.addKey with the 500 answer.

At the moment I don't have an immediate issue with the fix you are proposing but I think it is a red herring for the problem you are experiencing. I'm closing this request as a result. Please feel free to file a bug report so we can analyze this problem further.

I opened this issue then. Let's see if it is a red herring or not by analyzing the problem further :)

Receiving 500 error from `https://proxy.staging.widevine.com/proxy` in deferred license request.

I am investigating a failure in the YouTube 2018 EME tests, specifically the test 6. WidevineH264MultiMediaKeySessions. I have inherited some code that has several issues, but the other widevine tests in the 2018 suite I have managed to get working. The problem in this test is receiving HTTP 500 errors from the license server. 6. WidevineH264MultiMediaKeySessions specifies a different license server (https://proxy.staging.widevine.com/proxy) than the other widevine tests (https://dash-mse-test.appspot.com/api/drm/widevine<parameters>), and the HTTP 500 error is specific to 6. WidevineH264MultiMediaKeySessions.

The application flow in our code for the generateRequest API is to first always generate a server certificate request. The hexdump of this message looks like this,

Offset    00 01 02 03 04 05 06 07 08 09 0A 0B 0C 0D 0E 0F
00000000  08 04                                            ..

Our application then receives a message back from the server containing 716 bytes (I can provide the hexdump if it's useful); at this point, no surprises. The next step our application takes is to generate a deferred license request. When this message is sent to the license server, we receive back status 500 and no further information.

Hexdumps can be provided for these payloads, but I'll omit them for brevity initially.

The questions I have from this are,

  • Is there any public documentation about the APIs for these servers, the format of the messages they accept or anything like that? Is there documentation about it I should have with my licensed widevine CDM?
  • Why are there two different license servers (https://proxy.staging.widevine.com/proxy and https://dash-mse-test.appspot.com/api/drm/widevine<parameters>) in the tests, are there known differences in the types of messages you can send them?
  • The generation of the server certificate and then deferred license is determined by the Widevine CDM, I ran the test through Chrome and Firefox and didn't see the server certificate request generated (I understand they use different CDMs), so I expect this difference in behaviour can partly explain the 500 error, but the semantics of these request types isn't clear to me.

TIA.

Allow ranges for test selection

All test-selection-related options, like tests and exclude, should accept a range of tests. This would help a lot in reducing the number of values needed, e.g. for running all Playback Performance tests:

Currently: https://ytlr-cert.appspot.com/2019/main.html?test_type=playbackperf-test&raptor=true&command=run:3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20

Proposal: https://ytlr-cert.appspot.com/2019/main.html?test_type=playbackperf-test&raptor=true&command=run:3-20

Also allowing individual chunks would be nice:

Proposal: https://ytlr-cert.appspot.com/2019/main.html?test_type=playbackperf-test&raptor=true&command=run:3-20,34,81-89
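The proposed syntax could be expanded with a small parser like this (hypothetical sketch; the real harness would plug this into its command parsing):

```javascript
// Hypothetical parser for the proposed "3-20,34,81-89" selection syntax,
// expanding ranges into an explicit list of test numbers.
function expandTestSelection(spec) {
  const result = [];
  for (const part of spec.split(',')) {
    const m = /^(\d+)(?:-(\d+))?$/.exec(part.trim());
    if (!m) throw new Error('Bad selection chunk: ' + part);
    const start = Number(m[1]);
    const end = m[2] === undefined ? start : Number(m[2]);
    for (let i = start; i <= end; i++) result.push(i);
  }
  return result;
}
```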

No clearkey pssh in 2015's ClearKeyVideo test

I observed that only a PlayReady pssh of length 848 and a Widevine pssh of length 193 ship with the 2015 ClearKeyVideo test's video, even though 'drm-preferred-decryption-system-id' is set to the ClearKey UUID. Although ClearKey is the preferred ID, the browser's EME picks Widevine because there is no corresponding pssh. I observed the same in 2016's ClearKeyAudio.

Here is the log for reference

0:00:11.988359169 16288  0x169d000 LOG                  qtdemux qtdemux_dump.c:927:qtdemux_node_dump_foreach:   'free', [864], free
0:00:11.988693332 16288  0x169d000 LOG                  qtdemux qtdemux_dump.c:927:qtdemux_node_dump_foreach:   'pssh', [848], protection system specific header
0:00:11.989041661 16288  0x169d000 LOG                  qtdemux qtdemux_dump.c:927:qtdemux_node_dump_foreach:   'pssh', [193], protection system specific header
0:00:11.989374157 16288  0x169d000 LOG                  qtdemux qtdemux_dump.c:927:qtdemux_node_dump_foreach:   'mvex', [40], mvex
0:00:11.989709152 16288  0x169d000 LOG                  qtdemux qtdemux_dump.c:927:qtdemux_node_dump_foreach:     'trex', [32], moovie data
.
.
.
0:00:12.030887798 16288  0x169d000 LOG                  qtdemux qtdemux.c:12975:qtdemux_parse_tree:<qtdemux0> Parsing pssh box.
0:00:12.031255293 16288  0x169d000 DEBUG                qtdemux qtdemux.c:14125:gst_qtdemux_append_protection_system_id:<qtdemux0> Adding cenc protection system ID 9a04f079-9840-4286-ab92-e65be0885f95
0:00:12.031607789 16288  0x169d000 LOG                  qtdemux qtdemux.c:3830:qtdemux_parse_pssh:<qtdemux0> cenc pssh size: 848
0:00:12.032022784 16288  0x169d000 TRACE                qtdemux qtdemux.c:3839:qtdemux_parse_pssh:<qtdemux0> adding protection event for stream 0 and system 9a04f079-9840-4286-ab92-e65be0885f95
0:00:12.032347779 16288  0x169d000 LOG                  qtdemux qtdemux.c:12975:qtdemux_parse_tree:<qtdemux0> Parsing pssh box.
0:00:12.032653609 16288  0x169d000 DEBUG                qtdemux qtdemux.c:14125:gst_qtdemux_append_protection_system_id:<qtdemux0> Adding cenc protection system ID edef8ba9-79d6-4ace-a3c8-27dcd51d21ed
0:00:12.032976105 16288  0x169d000 LOG                  qtdemux qtdemux.c:3830:qtdemux_parse_pssh:<qtdemux0> cenc pssh size: 193
0:00:12.033340267 16288  0x169d000 TRACE                qtdemux qtdemux.c:3839:qtdemux_parse_pssh:<qtdemux0> adding protection event for stream 0 and system edef8ba9-79d6-4ace-a3c8-27dcd51d21ed

2016 EME test issue

I'm observing this error while running PlayReadyH264Video/PlayReadyAacAudio EME tests (YouTube 2016) on WPE browser on HiKey board.

https://yt-dash-mse-test.commondatastorage.googleapis.com/unit-tests/js/harness/main-20180216110138.js:123:16: CONSOLE LOG onNeedKey()
https://yt-dash-mse-test.commondatastorage.googleapis.com/unit-tests/js/lib/eme/encryptedMediaPortability-20180216110138.js:202:19: CONSOLE ERROR TypeError: First argument should be an object

This is because of the code at https://github.com/youtube/js_mse_eme/blob/master/js/lib/eme/2016/emeManager.js#L88; should it instead be "initData = addBMFFClearKeyID(e.initData, this.licenseManager.kids)"?

This test does not take into account devices with long pre-roll.

baseTimeDiff = util.ElapsedTimeInS() - video.currentTime;

This line, on devices using Chromium with a long video pre-roll, will result in a couple of timeupdate events being synthesized before any current-time update actually comes from the underlying native player. This results in a baseTimeDiff with little connection to actual time-update reporting. And since the test looks for the worst delta against wall-clock time, all actual changes in PTS may appear to have a rather big delta, no matter how stable the reporting actually is.

One way to account for pre-roll might be to ignore the synthesized events with

if (video.currentTime === 0.0)
  return;

baseTimeDiff = util.ElapsedTimeInS() - video.currentTime;

and getting the baseline at the first non-zero time reported by the player.
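The suggested guard can be factored into a small helper for clarity (the helper is my sketch; `util.ElapsedTimeInS()` and `video.currentTime` are the names used above):

```javascript
// Hypothetical helper: return null while the player still reports time 0
// (synthesized pre-roll timeupdates), otherwise the wall-clock baseline.
function computeBaseTimeDiff(elapsedTimeInS, currentTime) {
  if (currentTime === 0.0) return null;  // ignore synthesized events
  return elapsedTimeInS - currentTime;
}
```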

Update the archive for "Download-Media-files" section

Hello,

The archive that can be downloaded from the "Download-Media-files" section is very old, dating back to October 20th, 2018. A lot of videos are missing.
Can someone update the archive / URL?
Can you also check whether "Download-Source" reflects the latest 2020 updates?

Thank you!

Unclear how to set up

I can't figure out how to get these tests running locally, in particular I don't know where to get /third_party files from, or how the typescript files should be compiled.

Can you please add instructions to the readme?

Allow muted playback of video

It would be good to test at least video playback performance with audio muted. A new parameter called muted could be used to set video.muted = true/false before starting playback.

MSE Test: 12. AddSourceBuffer

The AddSourceBuffer test case was changed a few days ago
(audio/mp4, mp4a.40.2 -> audio/webm, opus).
LG TVs do not support the Opus codec,
so we get a failure.
Is there any chance of changing it back to AAC?
