
unai.vitc's Introduction

Unai.VITC

VITC example

Unai.VITC is a simple VITC (vertical interval timecode) signal generator, written in C# for .NET 6.0.

Usage

Unai.VITC is a console application: you run it from a terminal and control its behaviour by passing arguments to it (for example, to change the framerate; see below).

If no arguments are specified, the program writes the result to the console's standard output as a raw bitmap stream, with a resolution of 90x2 and an 8-bit grayscale pixel format. It defaults to 25 FPS progressive video and renders indefinitely. You will also see a lot of seemingly random characters printed on your console: that is the generated VITC data. Press Ctrl+C on the terminal to stop it.
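If you just want a quick visual preview instead, you can pipe the output into ffplay (a sketch, assuming FFmpeg's ffplay is installed; the window will be tiny because the stream is only 90x2 pixels):

Unai.VITC.exe | ffplay -f rawvideo -video_size 90x2 -pixel_format gray -framerate 25 -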

To save the output to a file, you can redirect it:

Unai.VITC.exe > my-vitc.raw

Alternatively, you can do the same thing with the -o parameter:

Unai.VITC.exe -o my-vitc.raw

Generate a video file

Unai.VITC only generates the raw VITC signal. If you want to turn the output into a video file, you can use FFmpeg. In this case, a simple pipe can connect both processes. The command should start like this:

Unai.VITC.exe | ffmpeg.exe -f rawvideo -video_size 90x2 -pixel_format gray -framerate 25 -i - […]

This makes FFmpeg take the raw VITC signal and interpret it as a raw video stream. Now we can finally generate a VITC signal and save it as a video file:

Unai.VITC.exe | ffmpeg.exe -f rawvideo -video_size 90x2 -pixel_format gray -framerate 25 -i - -f mp4 -c:v h264 -pix_fmt yuv420p -vf scale=360:4:flags=neighbor my-vitc.mp4

NOTE: some video codecs cannot process video streams below a certain resolution, so you must upscale the output. To avoid a blurry result, make sure nearest-neighbour scaling is used when upscaling. Also, make sure you convert the 8-bit grayscale pixel format to another pixel format if the former is not supported by the output codec (in this case, H.264 uses YUV 4:2:0).
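For example, here is a sketch of the same command with a larger nearest-neighbour upscale (the 720x16 target and the output file name are just illustrative choices):

Unai.VITC.exe | ffmpeg.exe -f rawvideo -video_size 90x2 -pixel_format gray -framerate 25 -i - -f mp4 -c:v h264 -pix_fmt yuv420p -vf scale=720:16:flags=neighbor my-vitc-large.mp4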

Results

FFmpeg has a video filter called readvitc which allows us to decode VITC lines from a video. All the possible framerates have been tested with this filter, giving the following results:

24 FPS

VITC 24 FPS example

25 FPS

VITC 25 FPS example

29.97 FPS

VITC 29.97 FPS example

30 FPS

VITC 30 FPS example
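If you want to reproduce this check on your own output, a minimal command might look like this (assuming the my-vitc.mp4 file generated earlier):

ffmpeg.exe -i my-vitc.mp4 -vf readvitc,metadata=print -f null -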

Arguments

-i: input file

Determines the video file/stream that the VITC lines will be stamped on. The original file is not overwritten; use -o to specify the output file. If this option is not specified, no base video is used and only the VITC line is rendered to the output file.
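Hypothetical example (illustrative file names; the input is assumed to be a raw stream matching the configured frame size and pixel format): stamp the VITC onto an existing video stream.

Unai.VITC.exe -i my-video.raw -o my-video-vitc.raw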

-o: output file

Determines the output file where the result will be saved. Use - (dash) to indicate the console's standard output. Default: -.

-f: set pixel format

It indicates the pixel format of both input and output video streams. Default: Grayscale8.

Common values are Grayscale8 and R8G8B8 (24-bit RGB).
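Example: output 24-bit RGB instead of grayscale

-f R8G8B8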

-s: set frame size

Sets the input and/or output frame size in pixels. Default: 90x2.

-s WxH where:

  • W is width.
  • H is height.
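Example (an illustrative value): set the frame size to 720x576

-s 720x576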

-fps: set framerate

Sets the frames per second. The only accepted values are 24, 25, and 30 (29.97 when combined with -d). Default: 25.

-fps [24/25/30]

Example: set framerate to 30 (ATSC)

-fps 30

Example: set framerate to 29.97 (NTSC)

-fps 30 -d (see also the -d modifier)

-tc: set initial timecode

It changes the initial timecode. Default: 00:00:00:00.

-tc HH:MM:SS:FF

Example: start timecode at 14:59:00, frame 6

-tc 14:59:00:06

-t: set duration

Determines how long the output will be. Default: indefinite.

-t HH:MM:SS:FF

Example: only render the first 15 seconds

-t 00:00:15:00

-ev: set an event

You can set an event at a specified time in order to change parameters or variables. EventType must be one of UserBits, UserBitsClear, or Timecode (case insensitive). Some events require an additional parameter, which must be joined to the EventType field with a = character.

-ev HH:MM:SS:FF EventType=input

UserBits event type

In the VITC signal, there are some reserved bits that can be used to transmit a maximum of 4 custom bytes per frame. These custom bits are called user bits and can be set at any time with the -ev modifier. To clear the user bits, use UserBitsClear.

UserBitsClear event type

Clears all the user bits. That is, the user bits are filled with zeros.

Timecode event type

It allows you to change the timecode at any time.

Example: set the user bits to "Test" at the beginning

-ev 00:00:00:00 UserBits=Test

Example: change the timecode to 23:00:00:00 when reaching 00:01:00:00

-ev 00:01:00:00 TimeCode=23:00:00:00

-I: interlaced mode

Specifies whether the video is interlaced. If it is interlaced, the program generates two VITC lines per frame: one for each field.
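Example (illustrative output file name): generate an interlaced 25 FPS signal and save it to a file

Unai.VITC.exe -I -fps 25 -o my-vitc-interlaced.raw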

-d: Drop-frame mode

Specifies whether to use drop-frame timecode. If 30 FPS video is selected with this option enabled, the program generates a 29.97 FPS VITC instead.

Example with multiple arguments

Create a VITC signal 10 seconds long. At second 2, set the user bits to "Test"; at second 4, clear them.

Unai.VITC.exe -t 00:00:10:00 -ev 00:00:02:00 UserBits=Test -ev 00:00:04:00 UserBitsClear

Result

VITC example 2
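To turn that same signal into a playable video, you can append the FFmpeg pipeline from the "Generate a video file" section (a sketch reusing the parameters shown there; the output file name is illustrative):

Unai.VITC.exe -t 00:00:10:00 -ev 00:00:02:00 UserBits=Test -ev 00:00:04:00 UserBitsClear | ffmpeg.exe -f rawvideo -video_size 90x2 -pixel_format gray -framerate 25 -i - -f mp4 -c:v h264 -pix_fmt yuv420p -vf scale=360:4:flags=neighbor my-vitc-userbits.mp4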

unai.vitc's People

Contributors

unai-d


Forkers

harrypm kiroffyt

unai.vitc's Issues

No compiled version / Executable

Dear developers,
I am a noob when it comes to programming and I have no clue how to use these C# files.
But I REALLY need this tool for a large-scale event where I will do video mapping (I am a VJ).
If anyone can compile this to an executable for either Mac or Windows that I can use, I would be forever grateful!

FFmpeg's readvitc is not detecting VITCs for all frames output from Unai.VITC

Firstly, love the concept of the Unai.VITC tool. Thank-you for putting this utility together.

I have successfully compiled the tool for macOS x86 using dotnet and .NET 3.x using...
$ dotnet publish -r osx-x64 -p:PublishSingleFile=true --self-contained true

However, I am struggling to use the example commands in the README.md to create a result that FFmpeg readvitc will recognize from the first frame. Would you be kind enough to share the commands that you used to create the APNG examples?

Here is the command I ran (based on the example in README.md). I have tried various framerates, padded inputs, overlays, unscaled, various scale-factors etc. I have also tried using an intermediary file, rather than a pipe.

# Long command is broken up into separate lines with \ for readability on Github...
#
$ ./Unai.VITC -fps 25 -t "00:01:00:00" -tc "00:00:00:00" \
  | ffmpeg -hide_banner -report \
  -f "rawvideo" -video_size "90x1" -pixel_format "gray" -framerate 25 -i - \
  -filter:v "scale=width=360:height=4:flags=neighbor,readvitc=scan_max=-1,metadata=print:file=readvitc.txt" \
  -t "00:01:00.000" -f "null" - -y
$ cat readvitc.txt

readvitc.txt (unexpected result) is the output of the readvitc filter, showing that no VITC is identified until 4 seconds / 100 frames into the file. It then stops detecting at 5 seconds, then starts working again at 10 seconds.

lavfi.readvitc.found=0
frame:99   pts:99      pts_time:3.96
lavfi.readvitc.found=0
frame:100  pts:100     pts_time:4
lavfi.readvitc.found=1
lavfi.readvitc.tc_str=00:00:04:00
frame:101  pts:101     pts_time:4.04
lavfi.readvitc.found=1
lavfi.readvitc.tc_str=00:00:04:01
frame:102  pts:102     pts_time:4.08

However, if I run FFmpeg against your animated PNG hosted samples on github, the VITC is detected and decoded from the first frame, as expected. Here is the working.txt result.

$ ffmpeg -hide_banner -report \
-i "https://raw.githubusercontent.com/unai-d/Unai.VITC/main/img/readvitc-pal.png" \
-filter:v "readvitc=scan_max=-1,metadata=print:file=working.txt" \
-f "null" - -y
$ cat working.txt

frame:0    pts:0       pts_time:0
lavfi.readvitc.found=1
lavfi.readvitc.tc_str=00:00:00:00
frame:1    pts:4000    pts_time:0.04
lavfi.readvitc.found=1
lavfi.readvitc.tc_str=00:00:00:01
frame:2    pts:8000    pts_time:0.08
lavfi.readvitc.found=1
lavfi.readvitc.tc_str=00:00:00:02
frame:3    pts:12000   pts_time:0.12

I can imagine how you have used FFmpeg's readvitc, metadata, drawtext filters to create your animated PNGs. I'm familiar with using metadata and drawtext filters.

Would you be kind enough to share your full command for generating the APNGs, and try my command on your build, in case the problem is specific to my build? I'm running a recent version of FFmpeg.

$ ffmpeg -version
ffmpeg version 4.4 Copyright (c) 2000-2021 the FFmpeg developers
built with Apple clang version 12.0.0 (clang-1200.0.32.29)

Thanks for the tool. I've been looking for a tool to generate VITC and embed 608 captions as Line 21 for a few years. Your project is the first open-source VITC rendering tool. Appreciated.

Incorrect speed on 30FPS

Hi.
Thanks for the nice tool.

I noticed that the output is a bit faster at 30 fps.

The default duration is 1m40s, while the output is 1m23s.
The result of readvitc by FFmpeg is also a bit faster.

An example command is below.
Unai.VITC.exe | ffmpeg.exe -f rawvideo -video_size 90x1 -pixel_format gray -framerate 30 -i - -f mp4 -c:v h264 -pix_fmt yuv420p -vf scale=360:4:flags=neighbor vitc90x4-30fps.mp4

At 25 fps, it works fine.

Are there any possible causes?

Thanks.

Add a basic Wiki

Congratulations on the first release!

Just some little things I thought I would mention to make things easier and/or more useful to users.

Accessibility

FFmpeg/.NET runtimes should be bundled in the release for self-contained operation (avoids the whole chicken-and-egg archive issue).

Detailed Example Commands & Test Data

Ideally, the README would be expanded; it's okay, but it needs more demo examples for people.

  • SMPTE/EBU Colour Bar Tests

  • Commands to take 720x576 PAL & 720x480 NTSC inputs and pad to broadcast standard 720x608 & 720x512 for example.

  • Considerations for commands to generate data suited for ld-chroma-encoder, as that allows software-side generation of full-spec 4fsc composite or S-Video data in .TBC format, which can be played back to analogue via a ~10 USD FL2000 VGA adapter on Windows/Linux (FL2K)
