Stumbled upon a few GitHub Gists that would be helpful if you use KSysGuard:
nvidia-gpu-sensor.pl
: https://gist.github.com/Sporif/4ce63f7b6eea691bdbb18905a9589169
RAW, in case the user or script ever gets deleted:
#!/usr/bin/perl -w
# act as a KSysGuard sensor
# provides NVIDIA GPU info via `nvidia-smi`
# Usage:
# 1. Save this script, make it executable, and move it to a directory in your $PATH
# 2. Save the KSysGuard worksheet file for Nvidia: https://gist.github.com/Sporif/31f0d8d9efc3315752aa4031f7080d79
# 3. In KSysGuard's menu, choose "File > Import Tab From File"
# 4. Open the worksheet file (nvidia.sgrd)
# See Also
# https://techbase.kde.org/Development/Tutorials/Sensors
$|=1;
print "ksysguardd 1.2.0\n";
print "ksysguardd> ";
while(<>){
if(/monitors/){
print "gpu_temp\tinteger\n";
print "gpu_fan_speed\tinteger\n";
print "gpu_core_usage\tinteger\n";
print "gpu_core_clock\tinteger\n";
print "gpu_mem_mib\tinteger\n";
print "gpu_mem_clock\tinteger\n";
print "gpu_video_decode\tinteger\n";
print "gpu_video_encode\tinteger\n";
}
if(/gpu_temp/){
if(/\?/){
print "GPU Temp\t0\t100\t°C\n";
}else{
print `nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader`;
}
}
if(/gpu_fan_speed/){
if(/\?/){
print "GPU Fan Speed\t0\t100\t%\n";
}else{
print `nvidia-smi --query-gpu=fan.speed --format=csv,noheader,nounits`;
}
}
if(/gpu_core_usage/){
if(/\?/){
print "GPU Core Usage\t0\t100\t%\n";
}else{
print `nvidia-smi --query-gpu=utilization.gpu --format=csv,noheader,nounits`;
}
}
if(/gpu_core_clock/){
if(/\?/){
print "GPU Core Clock\t0\t2500\tMHz\n";
}else{
print `nvidia-smi --query-gpu=clocks.current.graphics --format=csv,noheader,nounits`;
}
}
if(/gpu_mem_mib/){
if(/\?/){
print "GPU Memory Usage\t0\t".`nvidia-smi --query-gpu=memory.total --format=csv,noheader,nounits | perl -pe 'chomp'`."\tMiB\n";
}else{
print `nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits`;
}
}
if(/gpu_mem_clock/){
if(/\?/){
print "GPU Memory Clock\t0\t8000\tMHz\n";
}else{
print `nvidia-smi --query-gpu=clocks.current.memory --format=csv,noheader,nounits`;
}
}
if(/gpu_video_decode/){
if(/\?/){
print "GPU Decoding\t0\t100\t%\n";
}else{
print `nvidia-smi dmon -c 1 -s u | sed s/#// | awk '{print \$5}' | tail -n 1`;
}
}
if(/gpu_video_encode/){
if(/\?/){
print "GPU Encoding\t0\t100\t%\n";
}else{
print `nvidia-smi dmon -c 1 -s u | sed s/#// | awk '{print \$4}' | tail -n 1`;
}
}
print "ksysguardd> ";
}
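For reference, the script above speaks the simple line-based ksysguardd text protocol: print a version banner and a `ksysguardd> ` prompt, answer `monitors` with a `name<TAB>type` list, answer `name?` with `label<TAB>min<TAB>max<TAB>unit` metadata, and answer a bare `name` with the current value. Here's a minimal sketch of that protocol in Python, using a hypothetical `dummy_temp` sensor so it runs without `nvidia-smi` (the sensor name, label, and value are made up for illustration):

```python
# Minimal sketch of the ksysguardd sensor protocol used by the Perl
# script above, with a hypothetical dummy sensor instead of nvidia-smi.
SENSORS = {
    # name: (label, min, max, unit, value-producing function)
    "dummy_temp": ("Dummy Temp", 0, 100, "°C", lambda: 42),
}

def handle(line):
    """Return the response string for one protocol line."""
    line = line.strip()
    if line == "monitors":
        # Sensor list: one "name<TAB>type" entry per line.
        return "".join(f"{name}\tinteger\n" for name in SENSORS)
    name = line.rstrip("?")
    if name in SENSORS:
        label, lo, hi, unit, read = SENSORS[name]
        if line.endswith("?"):
            # Metadata query: "label<TAB>min<TAB>max<TAB>unit".
            return f"{label}\t{lo}\t{hi}\t{unit}\n"
        # Value query: just the current reading.
        return f"{read()}\n"
    return ""

# Demo: replay the exchange KSysGuard would have with the sensor.
for query in ["monitors", "dummy_temp?", "dummy_temp"]:
    print(f"ksysguardd> {query}")
    print(handle(query), end="")
```

This mirrors the Perl script's structure: the `monitors` branch lists sensors, the `/\?/` branches answer metadata queries, and the remaining branches shell out for values.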
There’s an error in the original script: the trailing line break in the `nvidia-smi` output wasn’t stripped, which breaks the memory-usage metadata line. This patch corrects it:
--- nvidia-gpu-sensor.pl 2019-02-22 14:25:25.337646000 -0800
+++ nvidia-gpu-sensor-fixed.pl 2019-02-22 21:46:22.872704984 -0800
@@ -13,6 +13,8 @@
# https://techbase.kde.org/Development/Tutorials/Sensors
$|=1;
+$MEM = "".`nvidia-smi -i 0 --query-gpu=memory.total --format=csv,noheader,nounits`."";
+chomp $MEM;
print "ksysguardd 1.2.0\n";
print "ksysguardd> ";
@@ -58,7 +60,7 @@
}
if(/gpu_mem_mib/){
if(/\?/){
- print "GPU Memory Usage\t0\t".`nvidia-smi --query-gpu=memory.total --format=csv,noheader,nounits | perl -pe 'chomp'`."\tMiB\n";
+ print "GPU Memory Usage\t0\t$MEM\tMiB\n";
}else{
print `nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits`;
}
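To see why the patch's `chomp $MEM;` matters: capturing a command's output keeps its trailing newline, exactly like Perl's backticks, so splicing the raw output into the middle of a tab-separated metadata line splits that line in two. A small Python illustration of the same failure mode, with `echo 8192` standing in for the real `nvidia-smi --query-gpu=memory.total ...` call:

```python
import subprocess

# Captured command output keeps the trailing newline, just like
# Perl backticks. "echo 8192" stands in for the nvidia-smi query.
raw = subprocess.check_output(["echo", "8192"]).decode()

# Splicing the raw output mid-line embeds a "\n" and splits the
# protocol line in two -- the bug the patch fixes.
broken = "GPU Memory Usage\t0\t" + raw + "\tMiB"

# str.strip() plays the role of Perl's chomp here.
fixed = "GPU Memory Usage\t0\t" + raw.strip() + "\tMiB"

print(repr(broken))  # embedded newline visible in the repr
print(repr(fixed))   # one well-formed tab-separated line
```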
And the 2nd file is the worksheet, nvidia.sgrd,
which you import into KSysGuard: https://gist.github.com/Sporif/31f0d8d9efc3315752aa4031f7080d79
RAW, in case the user or script ever gets deleted:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE KSysGuardWorkSheet>
<WorkSheet title="Nvidia" columns="2" rows="4" locked="0" interval="1">
<host name="127.0.0.1" port="-1" shell="" command="nvidia-gpu-sensor.pl"/>
<display labels="1" svgBackground="" vScroll="1" version="1" class="FancyPlotter" vLines="1" unit="" row="0" hScale="3" autoRange="1" fontSize="8" manualRange="0" rowSpan="1" columnSpan="1" showUnit="0" vDistance="30" column="0" stacked="0" title="GPU Temp" hLines="1">
<beam sensorName="gpu_temp" sensorType="integer" color="0xff0057ae" hostName="127.0.0.1"/>
</display>
<display labels="1" svgBackground="" vScroll="1" version="1" class="FancyPlotter" vLines="1" unit="" row="0" hScale="3" autoRange="1" fontSize="8" manualRange="0" rowSpan="1" columnSpan="1" showUnit="0" vDistance="30" column="1" stacked="0" title="GPU Fan Speed" hLines="1">
<beam sensorName="gpu_fan_speed" sensorType="integer" color="0xff0057ae" hostName="127.0.0.1"/>
</display>
<display labels="1" svgBackground="" vScroll="1" version="1" class="FancyPlotter" vLines="1" unit="" row="1" hScale="3" autoRange="1" fontSize="8" manualRange="0" rowSpan="1" columnSpan="1" showUnit="0" vDistance="30" column="0" stacked="0" title="GPU Core Usage" hLines="1">
<beam sensorName="gpu_core_usage" sensorType="integer" color="0xff0057ae" hostName="127.0.0.1"/>
</display>
<display labels="1" svgBackground="" vScroll="1" version="1" class="FancyPlotter" vLines="1" unit="" row="1" hScale="3" autoRange="1" fontSize="8" manualRange="0" rowSpan="1" columnSpan="1" showUnit="0" vDistance="30" column="1" stacked="0" title="GPU Core Clock" hLines="1">
<beam sensorName="gpu_core_clock" sensorType="integer" color="0xff0057ae" hostName="127.0.0.1"/>
</display>
<display labels="1" svgBackground="" vScroll="0" version="1" class="FancyPlotter" vLines="0" unit="" row="2" hScale="3" autoRange="1" fontSize="8" manualRange="0" rowSpan="1" columnSpan="1" showUnit="0" vDistance="30" column="0" stacked="0" title="GPU Memory Usage" hLines="1">
<beam sensorName="gpu_mem_mib" sensorType="integer" color="0xff0057ae" hostName="127.0.0.1"/>
</display>
<display labels="1" svgBackground="" vScroll="0" version="1" class="FancyPlotter" vLines="0" unit="" row="2" hScale="3" autoRange="1" fontSize="8" manualRange="0" rowSpan="1" columnSpan="1" showUnit="0" vDistance="30" column="1" stacked="0" title="GPU Memory Clock" hLines="1">
<beam sensorName="gpu_mem_clock" sensorType="integer" color="0xff0057ae" hostName="127.0.0.1"/>
</display>
<display labels="1" svgBackground="" vScroll="1" version="1" class="FancyPlotter" vLines="1" unit="" row="3" hScale="3" autoRange="1" fontSize="8" manualRange="0" rowSpan="1" columnSpan="1" showUnit="0" vDistance="30" column="0" stacked="0" title="GPU Video Decode" hLines="1">
<beam sensorName="gpu_video_decode" sensorType="integer" color="0xff0057ae" hostName="127.0.0.1"/>
</display>
<display labels="1" svgBackground="" vScroll="1" version="1" class="FancyPlotter" vLines="1" unit="" row="3" hScale="3" autoRange="1" fontSize="8" manualRange="0" rowSpan="1" columnSpan="1" showUnit="0" vDistance="30" column="1" stacked="0" title="GPU Video Encode" hLines="1">
<beam sensorName="gpu_video_encode" sensorType="integer" color="0xff0057ae" hostName="127.0.0.1"/>
</display>
</WorkSheet>
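If you want to sanity-check a worksheet file before importing it (e.g. after renaming the sensor script in the `command` attribute), it's plain XML and parses with Python's stdlib. A sketch using a trimmed two-display copy of the worksheet above, so the snippet is self-contained:

```python
import xml.etree.ElementTree as ET

# Trimmed two-display copy of the nvidia.sgrd worksheet above,
# inlined so this snippet runs standalone.
WORKSHEET = """<?xml version="1.0" encoding="UTF-8"?>
<WorkSheet title="Nvidia" columns="2" rows="4" locked="0" interval="1">
 <host name="127.0.0.1" port="-1" shell="" command="nvidia-gpu-sensor.pl"/>
 <display class="FancyPlotter" row="0" column="0" title="GPU Temp">
  <beam sensorName="gpu_temp" sensorType="integer" hostName="127.0.0.1"/>
 </display>
 <display class="FancyPlotter" row="0" column="1" title="GPU Fan Speed">
  <beam sensorName="gpu_fan_speed" sensorType="integer" hostName="127.0.0.1"/>
 </display>
</WorkSheet>"""

root = ET.fromstring(WORKSHEET)

# Every sensorName here must match a name the sensor script
# answers to in its "monitors" response.
sensors = [beam.get("sensorName")
           for display in root.iter("display")
           for beam in display.iter("beam")]
print(sensors)
print(root.find("host").get("command"))
```

The `command` attribute on the `host` element is what tells KSysGuard to launch `nvidia-gpu-sensor.pl` instead of connecting to a remote `ksysguardd`, which is why the script must be on your `$PATH`.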
Usage:
1. Put nvidia-gpu-sensor.pl in a directory on your $PATH, e.g. $HOME/.local/bin on Ubuntu.
2. Make it executable: chmod +x $HOME/.local/bin/nvidia-gpu-sensor.pl
3. Import the .sgrd worksheet from the Gist into KSysGuard ("File > Import Tab From File"). Without importing the worksheet, the Perl script is never launched.
DONE!
Gotta say, this is a pretty elegant way to get GPU monitoring out of KSysGuard using nvidia-smi.