A site for solving at least some of your technical problems...
NVIDIA provides a neat graphical interface that lets you see and change various values that the GPU makes available.
This interface opens when you run nvidia-settings.
At times, though, you may want to read or set one of those parameters from the command line, for example from a script. The cool thing is that the nvidia-settings tool lets you do exactly that.
For the details of how that works, use the --help command line option. The two main options are -a to assign a new value and -q to query existing values. The values are defined per GPU, per screen, per display (monitor), and for a few other targets.
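If you want to use those queries from a program, a small wrapper around the command line tool works well. Here is a minimal sketch; note that the exact text format of the `nvidia-settings -q` output is an assumption modeled on typical driver output and may vary between driver versions, and the `query()` helper is a name I made up for illustration:

```python
import re
import subprocess

# The line format below is modeled on typical `nvidia-settings -q` output,
# e.g.:  Attribute 'VideoDecoderUtilization' (myhost:0[gpu:0]): 20.
# The exact format may vary between driver versions -- treat this as a sketch.
ATTR_RE = re.compile(r"Attribute '(?P<name>\w+)' \((?P<target>[^)]*)\): (?P<value>-?\d+)")

def parse_attributes(output):
    """Extract {name: value} pairs from nvidia-settings -q output."""
    return {m.group("name"): int(m.group("value"))
            for m in ATTR_RE.finditer(output)}

def query(attribute):
    """Run `nvidia-settings -q <attribute>` and return its integer value."""
    out = subprocess.run(["nvidia-settings", "-q", attribute],
                         capture_output=True, text=True, check=True).stdout
    return parse_attributes(out).get(attribute)

# Demonstrate the parser on a canned sample (no GPU required):
sample = "  Attribute 'VideoDecoderUtilization' (myhost:0[gpu:0]): 20.\n"
print(parse_attributes(sample))  # {'VideoDecoderUtilization': 20}
```

Assigning a value works the same way, replacing `-q` with `-a` and passing `Attribute=value` as the argument.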
Note that some parameters are not going to be available on all devices. The feature itself is almost certainly present on the chip (the chips are all very similar), but the driver interface may not expose it. So how much the video encoder is used may not be available, the temperature may not be available, and so on.
On my end, I wanted to make sure that my code was indeed using the GPU to decode MP4 videos, so I used:
nvidia-settings -q VideoEncoderUtilization -q VideoDecoderUtilization
That shows how much the encoder and the decoder are used.
Since I am only decoding, the encoder remains at 0%. The decoder, however, jumps to around 20% while decompressing a 4K video.
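To watch the utilization evolve while a video plays, you can sample the value in a loop. A minimal sketch follows; the sampling function is injectable so the demonstration below runs without a GPU, and the `query("VideoDecoderUtilization")` call mentioned in the comment assumes a wrapper around `nvidia-settings -q` that you would write yourself:

```python
import time

def poll(sample_fn, interval=1.0, count=5):
    """Call sample_fn() `count` times, `interval` seconds apart,
    and return the collected values."""
    values = []
    for i in range(count):
        values.append(sample_fn())
        if i < count - 1:
            time.sleep(interval)
    return values

# In real use, sample_fn would shell out to nvidia-settings, e.g.:
#   poll(lambda: query("VideoDecoderUtilization"))
# Here we demonstrate with a fake sampler instead (no GPU required):
fake = iter([0, 18, 21, 19, 0])
print(poll(lambda: next(fake), interval=0.0))  # [0, 18, 21, 19, 0]
```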
The problem I have here, though, is that those two parameters are not available on a Jetson AGX Xavier board. Quite unfortunate.
Another tool available on the Tegra boards, such as the Jetsons, is tegrastats:
sudo tegrastats
The output is pretty technical; the documentation is available here now.
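Since each tegrastats line packs many "NAME value" fields on a single line, a tiny extractor makes it easier to watch just the field you care about. The sample line below is an assumption modeled on typical output; the exact set of fields varies by board and L4T release:

```python
import re

# A single tegrastats line packs many "NAME value" fields; the exact set
# varies by board and L4T release, so this sample is an assumption
# modeled on typical output, not a guaranteed format.
sample = ("RAM 2492/15691MB (lfb 2x4MB) SWAP 0/7846MB (cached 0MB) "
          "CPU [1%@1190,0%@1190,0%@1190,0%@1190] GR3D_FREQ 0% "
          "NVDEC 716 NVENC off")

def field(line, name):
    """Return the raw token following `name` in a tegrastats line, or None."""
    m = re.search(rf"\b{name} (\S+)", line)
    return m.group(1) if m else None

print(field(sample, "NVDEC"))      # decoder clock when active, "off" when idle
print(field(sample, "GR3D_FREQ"))  # GPU load
```

Piping `sudo tegrastats` into a loop that calls `field()` on each line gives you a rough decoder-activity monitor on boards where nvidia-settings does not expose the utilization counters.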