
NHVE Network Hardware Video Encoder C library

Library for hardware video encoding and streaming over custom MLSP protocol.

See also twin NHVD network decoder.

See hardware-video-streaming for other related projects.

The intent behind this library:

  • minimize video latency
  • minimize CPU usage (hardware encoding and color conversions)
  • multi-frame streaming (e.g. depth + texture)
  • auxiliary data channels (e.g. IMU, odometry, metadata)
  • simple user interface

If you have no specific needs, you should start with GStreamer or FFmpeg command-line streaming instead.

Platforms

Unix-like operating systems (e.g. Linux). The platform dependency comes from MLSP's socket use (easily portable).

Tested on Ubuntu 18.04.

Hardware

Intel VAAPI compatible hardware encoders (Quick Sync Video).

ATI/AMD hardware may also work through VAAPI (libva-mesa-driver), however this is untested.

The dependency is through HVE implementation (see HVE issues).

Tested on LattePanda Alpha and i7-7820HK laptop.

Dependencies

The library depends on:

  • HVE Hardware Video Encoder (which in turn needs FFmpeg avcodec, avutil, avfilter)
  • MLSP Minimal Latency Streaming Protocol

HVE and MLSP are included as submodules so you only need to satisfy the HVE dependencies.

Works with system FFmpeg on Ubuntu 18.04 but not on 16.04 (outdated FFmpeg and VAAPI ecosystem).

Building Instructions

Tested on Ubuntu 18.04.

# update package repositories
sudo apt-get update 
# get avcodec and avutil
sudo apt-get install ffmpeg libavcodec-dev libavutil-dev libavfilter-dev
# get compilers and make 
sudo apt-get install build-essential
# get cmake - we need libcurl4 due to an Ubuntu 18.04 dependency problem
sudo apt-get install libcurl4 cmake
# get git
sudo apt-get install git
# clone the repository with *RECURSIVE* for submodules
git clone --recursive https://github.com/bmegli/network-hardware-video-encoder.git

# finally build the library and examples
cd network-hardware-video-encoder
mkdir build
cd build
cmake ..
make

Running example

Stream procedurally generated H.264/HEVC video over UDP (moving through greyscale)

# Usage: ./nhve-stream-* <ip> <port> <seconds> [device]
./nhve-stream-h264 127.0.0.1 9766 10
./nhve-stream-hevc10 127.0.0.1 9766 10
./nhve-stream-multi 127.0.0.1 9766 10
./nhve-stream-h264-aux 127.0.0.1 9766 10

You may need to specify VAAPI device if you have more than one (e.g. NVIDIA GPU + Intel CPU).

# Usage: ./nhve-stream-* <ip> <port> <seconds> [device]
./nhve-stream-h264 127.0.0.1 9766 10 /dev/dri/renderD128 #or D129
./nhve-stream-hevc10 127.0.0.1 9766 10 /dev/dri/renderD128 #or D129
./nhve-stream-multi 127.0.0.1 9766 10 /dev/dri/renderD128 #or D129
./nhve-stream-h264-aux 127.0.0.1 9766 10 /dev/dri/renderD128 #or D129

If you don't have a receiving end, you will just see whether hardware encoding worked.

If you get errors see also HVE troubleshooting.

Using

See examples directory for more complete and commented examples with error handling.

See HVE docs for details about hardware configuration.

//prepare library data
struct nhve_net_config net_config = {IP, PORT};
struct nhve_hw_config hw_config = {WIDTH, HEIGHT, FRAMERATE, DEVICE, ENCODER,
                                   PIXEL_FORMAT, PROFILE, BFRAMES, BITRATE,
                                   QP, GOP_SIZE, COMPRESSION_LEVEL, LOW_POWER};

//initialize single hardware encoder
struct nhve *streamer = nhve_init(&net_config, &hw_config, 1, 0);

struct nhve_frame frame = { 0 };

//later assuming PIXEL_FORMAT is "nv12" (you can use something else)

//fill with your stride (width including padding if any)
frame.linesize[0] = frame.linesize[1] = WIDTH;

//...
//whatever logic you have to prepare data source
//..

while(keep_streaming)
{
	//...
	//update NV12 Y and color data (e.g. get them from camera)
	//...

	//fill nhve_frame with increasing framenumber and
	//pointers to your data in NV12 pixel format
	frame.data[0]=Y; //dummy luminance plane
	frame.data[1]=color; //dummy UV plane
	
	//encode and send this frame
	if( nhve_send(streamer, &frame, 0) != NHVE_OK)
		break; //break on error
}

//flush the streamer by sending NULL frame
nhve_send(streamer, NULL, 0);

nhve_close(streamer);

That's it! You have just seen all the functions and data types in the library.

The same interface works for multi-frame streaming with:

  • array of hardware configurations in nhve_init
  • nhve_send(streamer, &frame0, 0), nhve_send(streamer, &frame1, 1), ...
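A hedged usage fragment for the multi-frame case (depends on nhve.h; the configuration values are placeholders, not a definitive implementation):

```c
// Hypothetical sketch: two hardware configurations, e.g. depth + texture.
// Fill both configs as in the single-frame example before calling nhve_init.
struct nhve_net_config net_config = {IP, PORT};
struct nhve_hw_config hw_configs[2] = { {0}, {0} };
// ...fill hw_configs[0] (e.g. depth) and hw_configs[1] (e.g. texture)...

struct nhve *streamer = nhve_init(&net_config, hw_configs, 2, 0);

struct nhve_frame depth = {0}, texture = {0};
// ...fill linesize/data of both frames as in the single-frame example...

nhve_send(streamer, &depth, 0);   // subframe 0
nhve_send(streamer, &texture, 1); // subframe 1
```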

The same interface works for non-video (raw) data streaming with:

  • number of auxiliary channels in nhve_init
  • nhve_send with frame.data[0] of size frame.linesize[0] raw data
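A hedged usage fragment for an auxiliary channel (depends on nhve.h; the subframe index used for aux data is an assumption - here the aux channel is addressed after the single video subframe):

```c
// Hypothetical sketch: 1 video subframe + 1 auxiliary channel.
// The last nhve_init argument is the number of auxiliary channels.
struct nhve *streamer = nhve_init(&net_config, &hw_config, 1, 1);

struct nhve_frame aux = {0};
aux.data[0] = raw_bytes;          // raw payload, e.g. a packed IMU sample
aux.linesize[0] = raw_bytes_size; // payload size in bytes

// assumption: aux channels follow the video subframes,
// so with one video subframe the aux channel is index 1
nhve_send(streamer, &aux, 1);
```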

Compiling your code

IDE (recommended)

The simplest way is to copy headers and sources of HVE, MLSP and NHVE to your project and link with avcodec, avutil and avfilter.

CMake

See realsense-network-hardware-video-encoder as example.

License

The library and my dependencies are licensed under the Mozilla Public License, v. 2.0.

This is similar to LGPL but more permissive:

  • like LGPL, you can use it in proprietary software
  • unlike LGPL, you may compile it statically with your code

Like with LGPL, if you modify this library you have to make your changes available. A GitHub fork of the library with your changes satisfies this requirement.

Since you are linking to FFmpeg libraries, consider also the avcodec, avutil and avfilter licensing.

Additional information

Library uses

Realsense D400 infrared/color H.264 and infrared/color/depth HEVC streaming - realsense-network-hardware-video-encoder


Issues

framenumber not handled correctly when flushing encoder

Flushing the encoder may result (e.g. with B-frames > 0) in producing multiple frames.

The hardware is flushed correctly, but in the current implementation all those frames would be sent with the same framenumber.

There are many solutions to this problem:

  • only send last frame produced with flushing
  • increase framenumber in the function
  • completely hide framenumber concept from the user

handle framenumber internally

The library user shouldn't be burdened with an internal library concept anyway.

All they should need to do is supply new data.

Related to #7

Force keyframe in streaming

Hi @bmegli
Thank you for helping out.
We were testing h264 streaming. It generally performs quite well in a good network.
But what we found in an unreliable network is that the video quality does not recover, and the receiver also has to start before the streamer to get the keyframe.

I wonder if there is any way we can force the streamer to send a keyframe periodically, in order to recover even when dropping packets or restarting the receiver?

Thanks.
Chang
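For reference, the hardware configuration shown in the Using section includes a GOP size field, and with VAAPI encoders a finite GOP size yields periodic keyframes. A hedged fragment (depends on nhve.h; the field name gop_size is an assumption inferred from the initializer order in the Using section - verify against the header):

```c
// Hypothetical sketch: periodic keyframes via GOP size.
// Assumes nhve_hw_config has a gop_size field - verify against nhve.h.
struct nhve_hw_config hw_config = {0};
hw_config.width = 1280;
hw_config.height = 720;
hw_config.framerate = 30;
hw_config.gop_size = 30; // request a keyframe roughly once per second at 30 fps
```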

add example index

Now that there are 4 examples, some kind of index is necessary (e.g. a README.md).

Bandwidth utilisation not adjustable

Hi @bmegli

I was testing the performance of the encoding with different bandwidth parameters

the initialisation I used was:

hw_config_.pixel_format = "nv12";
hw_config_.profile = FF_PROFILE_H264_CONSTRAINED_BASELINE;
hw_config_.encoder = NULL;
hw_config_.width = 1280;
hw_config_.height= 720;
hw_config_.framerate = 30;
hw_config_.max_b_frames = 0;
hw_config_.device = NULL; //NULL as last argv argument, or device path
hw_config_.bit_rate = 4000000;

However, traffic monitoring in Linux shows only 1.8-2 Mbps. I have tried different bit_rate values (10,000,000 and 20,000,000) but they did not really make any difference. Did I reach some limitation of the hardware? (Not sure if H.264 is able to handle 1280x720 resolution.)

Also, I have two NHVE instances initialised with different bit rates and different ports. Would that make a difference?

Best Regards,
Chang

various encoders support

This is a natural extension after implementing this in HVE.

This means support for:

  • mjpeg
  • mpeg2
  • vp8
  • vp9
  • hevc

Implementation is rather straightforward:

  • bump HVE submodule
  • extend hardware configuration as in HVE

Related to hardware-video-streaming#1

error compiling in ubuntu 16.04

Great package!

I have installed ffmpeg-3.4.1 from source
and installed all the remaining dependencies via apt install.

However, I still get the following error:

[  7%] Building C object minimal-latency-streaming-protocol/CMakeFiles/mlsp.dir/mlsp.c.o
[ 14%] Linking C static library libmlsp.a
[ 14%] Built target mlsp
[ 21%] Building C object hardware-video-encoder/CMakeFiles/hve.dir/hve.c.o
[ 28%] Linking C shared library libhve.so
[ 28%] Built target hve
[ 35%] Building C object CMakeFiles/nhve.dir/nhve.c.o
[ 42%] Linking C static library libnhve.a
[ 42%] Built target nhve
[ 50%] Building C object CMakeFiles/nhve-stream-h264.dir/examples/nhve_stream_h264.c.o
[ 57%] Linking C executable nhve-stream-h264
hardware-video-encoder/libhve.so: undefined reference to `av_hwframe_get_buffer'
hardware-video-encoder/libhve.so: undefined reference to `av_hwframe_ctx_init'
hardware-video-encoder/libhve.so: undefined reference to `av_hwframe_ctx_alloc'
hardware-video-encoder/libhve.so: undefined reference to `av_hwframe_transfer_data'
hardware-video-encoder/libhve.so: undefined reference to `avcodec_send_frame'
hardware-video-encoder/libhve.so: undefined reference to `av_hwdevice_ctx_create'
hardware-video-encoder/libhve.so: undefined reference to `avcodec_receive_packet'
collect2: error: ld returned 1 exit status
CMakeFiles/nhve-stream-h264.dir/build.make:97: recipe for target 'nhve-stream-h264' failed
make[2]: *** [nhve-stream-h264] Error 1
CMakeFiles/Makefile2:69: recipe for target 'CMakeFiles/nhve-stream-h264.dir/all' failed
make[1]: *** [CMakeFiles/nhve-stream-h264.dir/all] Error 2
Makefile:127: recipe for target 'all' failed
make: *** [all] Error 2

Please help.

Best Regards,
Chang

Stream opencv images cv::mat

Hi @bmegli
I'm trying to stream standard opencv cv::mat data through h264.

hw_config:

hw_config_.pixel_format = "nv12";
hw_config_.profile = FF_PROFILE_H264_CONSTRAINED_BASELINE; 
hw_config_.encoder = "h264_vaapi";
hw_config_.width = 640;
hw_config_.height= 480;
hw_config_.framerate = 30;
hw_config_.device = "/dev/dri/renderD128";

When streaming, I use the following:

nhve_frame frame = {0};

frame.linesize[0] =  hw_config_.width;
frame.data[0] = y.ptr(); // y is CV_8UC1, 640x480

frame.linesize[1] = hw_config_.width;
frame.data[1] = uv.ptr();// uv is CV_8UC1, 640x240

frame.framenumber = framenumber_;
nhve_send_frame(streamer_, &frame);

It fails with the following error:

corrupted size vs. prev_size

Are you able to spot anything wrong?

Thank you in advance.
Chang

stream RGB data

Hi @bmegli
Just wondering if you could share a script to stream rgb images?
I have the pixel format set to hw_config_.pixel_format = "bgr0";

and trying to stream with the following code

	static uint16_t framenumber_ = 0;
	struct nhve_frame frame = { 0 };
	frame.linesize[0] = color.step[0];

	frame.data[0] = (uint8_t*)color.ptr();
	frame.framenumber = framenumber_;
	
	if (nhve_send_frame(streamer_color_,&frame)) {
		NODELET_ERROR ("nhve_send_frame failed to send"); //break on error
		return 1;
	}

	framenumber_++;

and also tried

cv::Mat bgr[3];   //destination array
	cv::split(color,bgr);//split source  
	static uint16_t framenumber_ = 0;
	struct nhve_frame frame = { 0 };
	frame.linesize[0] = frame.linesize[1] = frame.linesize[2] = bgr[0].step[0];

	frame.data[0] = bgr[0].ptr();
	frame.data[1] = bgr[1].ptr();
	frame.data[2] = bgr[2].ptr();
	frame.framenumber = framenumber_;
	
	if (nhve_send_frame(streamer_color_,&frame)) {
		NODELET_ERROR ("nhve_send_frame failed to send"); //break on error
		return 1;
	}

	framenumber_++;

It compiles fine but always comes up with this runtime error:

[AVHWFramesContext @ 0x55a324ca6ba0] Map surface 0x4000013.
Assertion abs(src_linesize) >= bytewidth failed at src/libavutil/imgutils.c:313
 05/05 00:32:40,334 ERROR [140507744708352] (types.h:307) get_device_time_ms() took too long (more then 2 mSecs)

Please advise on a possible solution.

Thank you in advance.

Chang
