NVENC actually uses a lot of GPU resources and watts

DayGeckoArt

Member
The premise of NVENC is to handle video encoding without much impact on game performance. There's a lot of discussion on the internet about the resources NVENC uses, across forums, blogs, YouTube videos, etc., and the general consensus seems to be that it does use a bit of CPU and GPU computing power, more if you use color formats other than NV12, but not enough to worry about.

But I recently discovered, by playing and recording a 15-year-old game, World of Warcraft, that NVENC encoding uses quite a bit of CUDA computing power and watts! I have an RTX 2060S, and I play at 4K with Vsync on, limiting FPS to 60. OBS is set to 4:4:4 color and 30 fps using NVENC_HEVC with rc=constqp qp=30.
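To isolate the encode from OBS entirely, the same settings can be approximated with a direct ffmpeg call. A minimal sketch, assuming an ffmpeg build with NVENC support (the file names are placeholders, and the flag mapping is my reconstruction of the settings above, not the poster's actual config):

```python
# Reproduce the OBS custom-FFmpeg encode outside OBS for benchmarking.
# Flags mirror the settings quoted above: HEVC via NVENC, constant QP 30,
# 4:4:4 chroma, 30 fps. "input.mp4"/"output.mkv" are placeholder names.
import subprocess

subprocess.run(
    [
        "ffmpeg", "-y",
        "-i", "input.mp4",
        "-c:v", "hevc_nvenc",   # current name; NVENC_HEVC is the deprecated alias
        "-rc", "constqp",
        "-qp", "30",
        "-pix_fmt", "yuv444p",  # 4:4:4 color, as in the OBS settings
        "-r", "30",
        "output.mkv",
    ],
    check=True,
)
```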

With OBS open but not recording, the GPU clock is 1200 MHz and the card draws about 70 watts. When recording, the clock increases to 1905 MHz and 140 watts. CUDA usage goes from 0% to about 35%. So that's 70 watts just for encoding!

I'm monitoring using HWiNFO64 with the Stream Deck plugin.
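If you want to log the same counters programmatically, NVIDIA's NVML exposes them. A rough cross-check using the pynvml bindings (pip install nvidia-ml-py); note that NVML reports the dedicated encoder's utilization separately from CUDA/3D utilization:

```python
# Poll GPU clock, board power, CUDA/3D utilization, and dedicated-encoder
# utilization once a second, as a cross-check for HWiNFO64's readings.
import time

import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; adjust if needed

try:
    while True:
        clock_mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000  # NVML reports mW
        util = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu
        enc_util, _period_us = pynvml.nvmlDeviceGetEncoderUtilization(gpu)
        print(f"{clock_mhz} MHz  {power_w:.0f} W  GPU {util}%  ENC {enc_util}%")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```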

Log:
 

Attachments

  • NO_REC.jpg (315.1 KB)
  • NVENC.jpg (320.7 KB)
  • advanced_for_cuda_test.PNG (47.6 KB)
  • settings_for_cuda_test.PNG (56.7 KB)

PaiSand

Active Member
Always check the analyzer. It gives you more hints and fixes than you think.

Also, the log file says:
08:42:54.686: warning: This encoder is deprecated, use 'hevc_nvenc' instead
 

DayGeckoArt

Member
Always check the analyzer. It gives you more hints and fixes than you think.

Also, the log file says:

Interesting, but none of the things it lists is actually a mistake. It's hilarious that it calls 4:4:4 "the wrong color format" with a red "critical" icon!

I would choose HEVC_NVENC, but it doesn't always appear as an option... It didn't when I chose my encoder before, but it does now!
Edit: Now I see why: it shows up because I have the "show all codecs" option checked. HEVC_NVENC is apparently "potentially incompatible" with MKV containers.
 
Last edited:

Harold

Active Member
The use of custom FFmpeg output mode has always been "when it breaks, you're on your own".

Do you get similar results when you use simple output mode or the standard (not custom) encoder settings?
 

DayGeckoArt

Member
The use of custom FFmpeg output mode has always been "when it breaks, you're on your own".

Do you get similar results when you use simple output mode or the standard (not custom) encoder settings?

I just tried Simple output mode with the defaults, and CUDA usage is lower, at 20% with 120 watts. That makes sense because it's x264 instead of HEVC.

I don't think anything is broken; NVENC just uses more resources than people realize... Usually it gets masked by the game using as much GPU power as it can, or by recording at only 1080p with NV12.
 

koala

Active Member
The NVENC circuit needs power for encoding, of course. It's a high-performance specialized subprocessor within the GPU, and it uses power just like any other processing unit in the computer. If someone says NVENC is free, that only means it's free from using up the computing resources that render 3D graphics, not that it's free from power consumption.
 

DayGeckoArt

Member
The NVENC circuit needs power for encoding, of course. It's a high-performance specialized subprocessor within the GPU, and it uses power just like any other processing unit in the computer. If someone says NVENC is free, that only means it's free from using up the computing resources that render 3D graphics, not that it's free from power consumption.

Well, it shows 30-40% CUDA usage, and I believe the 70 watts reflects that CUDA processing. The max power draw I see from the RTX 2060S is 180-190W, so the numbers make sense: 35% of roughly 190W is about the 70W delta. In a modern game like Battlefield 2042, the power draw is pegged at that 180-190W whether recording or not, but the framerate drops significantly when recording.
 

DayGeckoArt

Member
What's also interesting is that my Quadro T400 maxes out at 31 total watts while recording and running a benchmark. Unfortunately, HWiNFO64 doesn't show CUDA usage for that card. But it seems like NVENC is adaptive and will use whatever resources it can get?
 

DayGeckoArt

Member
It turns out the T400 does report CUDA usage, but I had Hardware-Accelerated GPU Scheduling (HAGS) enabled, which prevents CUDA and some other metrics from being reported to the OS. I turned it off and tested by screen-recording a 4K drone video with the same settings. GPU power only goes from 20W to 22W on that card! CUDA usage shows 13%.
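On the HAGS point, here's a small sketch that reads the toggle's state on Windows. The HwSchMode value and its meanings (2 = on, 1 = off) are my assumption based on the commonly documented registry location, so treat the Windows Settings app as the authoritative switch:

```python
# Check whether Hardware-Accelerated GPU Scheduling (HAGS) is enabled.
# Assumption: the HAGS state lives in the HwSchMode DWORD under the
# GraphicsDrivers key, with 2 = enabled and 1 = disabled.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    try:
        mode, _value_type = winreg.QueryValueEx(key, "HwSchMode")
        print("HAGS enabled" if mode == 2 else "HAGS disabled")
    except FileNotFoundError:
        print("HwSchMode not set (driver default applies)")
```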
 

DayGeckoArt

Member
I just had another idea... I used my T400 PC to play back the same WoW video I made on my gaming PC and screen-recorded that. The numbers are surprisingly similar! CUDA usage is 37%, and GPU power goes up from 20W to 28W. This is interesting because the T400 has 384 CUDA cores vs. 2176 on the RTX 2060S. So as a fraction of available CUDA power, both cards use roughly the same third, but the T400 spends only 8 extra watts on it.
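A quick back-of-the-envelope check of that comparison, using only the utilization and power figures reported in this thread: the fraction is similar, but the absolute busy compute and power are very different.

```python
# Compare the two cards using the numbers reported above: similar
# *fractional* CUDA usage, very different absolute silicon and power.
cards = {
    # name: (CUDA cores, CUDA utilization while encoding, power delta in W)
    "RTX 2060 Super": (2176, 0.35, 70),
    "T400": (384, 0.37, 8),
}

for name, (cores, util, delta_w) in cards.items():
    print(f"{name}: ~{cores * util:.0f} cores' worth busy, +{delta_w} W while encoding")
```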
 

Attachments

  • t400_norec.jpg (267.2 KB)
  • t400_nvenc.jpg (267.9 KB)

TryHD

Member
Use preset p1 and YUV 4:2:0; with that, CUDA usage from encoding should be gone. Everything else of course still uses GPU resources.
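A sketch of what that suggestion looks like in the same custom-FFmpeg form as the original test (file names are placeholders; the p1-p7 presets assume a reasonably recent ffmpeg build with NVENC support):

```python
# Sketch of the suggested low-overhead settings: fastest preset (p1) and
# 4:2:0 chroma (nv12), keeping the constqp rate control from the original
# test. "input.mp4"/"output.mkv" are placeholder file names.
import subprocess

subprocess.run(
    [
        "ffmpeg", "-y",
        "-i", "input.mp4",
        "-c:v", "hevc_nvenc",
        "-preset", "p1",     # least encoder effort
        "-rc", "constqp", "-qp", "30",
        "-pix_fmt", "nv12",  # 4:2:0, the layout NVENC consumes natively
        "output.mkv",
    ],
    check=True,
)
```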
 