I'm not too familiar with the low-level details of video processing, but I'd like to share some performance observations from my own work.
### CPU Usage for Decoding
My test pulls a single source video stream, runs some intermediate processing on it (consuming about 50% CPU), then re-encodes it on the GPU and pushes it to an RTMP server (about 12% CPU). I then watch the final processed live stream. A rough sketch of this pipeline follows the setup list below.
Here's my setup:
- CPU: Intel Xeon Gold 5118 @ 2.30 GHz × 8 cores
- GPU: NVIDIA Tesla V100 32 GB
- Source video: 1080p, 24 fps, 4 Mbps, H.264 baseline profile
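Something along these lines can be reproduced with ffmpeg. This is only a sketch under my assumptions, not my exact command: the stream URLs are placeholders and the intermediate processing step is omitted.

```python
# Rough ffmpeg-based sketch of the pipeline: pull a stream, decode
# (CPU or NVDEC), re-encode with NVENC, push to RTMP. The URLs and
# bitrate are placeholders; the intermediate processing is omitted.
import subprocess

SRC = "rtmp://example.com/live/source"   # placeholder input stream
DST = "rtmp://example.com/live/output"   # placeholder RTMP target

def start_transcode(gpu_decode: bool) -> subprocess.Popen:
    cmd = ["ffmpeg", "-hide_banner"]
    if gpu_decode:
        cmd += ["-hwaccel", "cuda"]       # NVDEC decoding on the GPU
    cmd += [
        "-i", SRC,
        # ... intermediate processing (filters) would go here ...
        "-c:v", "h264_nvenc",             # GPU encoding in both cases
        "-b:v", "4M",                     # match the 4 Mbps source
        "-f", "flv", DST,                 # RTMP expects FLV muxing
    ]
    return subprocess.Popen(cmd)
```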
The CPU usage figures below are not exact measurements; they are rough impressions from watching htop (a more precise sampling sketch follows the table).
| Decoding Scenario | Min | Avg | Max |
| --- | --- | --- | --- |
| Single stream with CPU | 18% | 22% | 30% |
| Single stream with GPU | 20% | 23% | 25% |
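For anyone who wants numbers less hand-wavy than htop, per-process CPU usage can be sampled programmatically. Here is a minimal sketch with psutil; the decoder's pid is assumed to be known:

```python
# Sample a process's CPU usage once per second and report min/avg/max.
# Note: psutil reports percent of a single core, so values can exceed
# 100% on a multi-core machine.
import psutil

def sample_cpu(pid: int, seconds: int = 60) -> tuple[float, float, float]:
    proc = psutil.Process(pid)
    samples = [proc.cpu_percent(interval=1.0) for _ in range(seconds)]
    return min(samples), sum(samples) / len(samples), max(samples)
```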
When it comes to decoding 6 streams in parallel (a decode-only benchmark sketch follows this list):
- With GPU decoding: CPU usage stabilizes at nearly 100% across all 8 cores, and the resulting stream plays almost smoothly.
- With CPU decoding: total CPU usage fluctuates between 40% and 80%, but the video stutters noticeably more than with the GPU.
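Here is the kind of decode-only benchmark I mean. It is a sketch with placeholder stream URLs; `-f null -` discards the decoded frames so encoding cost stays out of the measurement:

```python
# Launch 6 parallel decode-only ffmpeg jobs; drop "-hwaccel", "cuda"
# to benchmark CPU decoding instead. Stream URLs are placeholders.
import subprocess

urls = [f"rtmp://example.com/live/stream{i}" for i in range(6)]
procs = [
    subprocess.Popen([
        "ffmpeg", "-hide_banner",
        "-hwaccel", "cuda",   # GPU (NVDEC) decoding
        "-i", url,
        "-f", "null", "-",    # decode and discard frames
    ])
    for url in urls
]
for p in procs:
    p.wait()
```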
In my case, I would probably opt for the GPU solution, since it appears more stable, although it seems less energy-efficient.