[LINUX] CPU temperature incorrect in XBMC for Linux?
#31
CrashX Wrote:By the look of the code ... it is looking for integer and character ... %d %c format ...

Right, just like the wiki says :) I take it everything works fine for you, CrashX?
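To spell that out: the command's output has to match the "%d %c" pair, i.e. an integer, whitespace, then a single scale character. A decimal reading like "29.50 C" won't parse, because %d stops at the dot and %c picks up the "." instead of the scale letter. A minimal sketch of valid output (the value here is made up):
Code:
# XBMC reads the command's stdout as an integer plus one scale character
printf '%d %c\n' 45 C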
Reply
#32
Tried
Code:
<advancedsettings>
<gputempcommand>echo "$(nvidia-settings -tq gpuCoreTemp) C"</gputempcommand>
<cputempcommand>echo "$(sensors -u | grep "temp2_input"| awk '{print $2 }') C"</cputempcommand>
</advancedsettings>
Which also didn't work for either CPU or GPU; XBMC just shows a "?" for both. The commands themselves do return what looks like proper output:
Code:
rodalpho@fiddler:~$ echo "$(sensors -u | grep "temp2_input"| awk '{print $2 }') C"
29.50 C
rodalpho@fiddler:~$ echo "$(nvidia-settings -tq gpuCoreTemp) C"
37 C

Edit: Also tried
Code:
rodalpho@fiddler:~$ echo "$(sensors -u | grep "temp2_input"| awk '{print $2 }' |awk '{printf("%d\n",$1 + 0.5);}') C"
44 C
which returns an integer with no decimal, and it didn't work either. Just get question marks for both CPU and GPU temps.

Edit2: OK, after updating to 17366, the following advancedsettings.xml actually works, but in XBMC the numbers are converted to Fahrenheit! They show up as 104F and 88F for GPU and CPU respectively, when I'd prefer Celsius. When advancedsettings.xml is removed, I'm back to question marks. Very, very weird.
Code:
<advancedsettings>
<gputempcommand>echo "$(nvidia-settings -tq gpuCoreTemp) C"</gputempcommand>
<cputempcommand>echo "$(sensors -u | grep "temp2_input"| awk '{print $2 }' |awk '{printf("%d\n",$1 + 0.5);}') C"</cputempcommand>
</advancedsettings>
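As an aside, the grep plus two chained awk calls in the cputempcommand can be collapsed into a single awk that matches, rounds, and prints in one pass (an untested simplification of the same pipeline):
Code:
# match the temp2_input line, round to an integer, print -- one awk instead of grep + two awks
echo "$(sensors -u | awk '/temp2_input/ {printf("%d", $2 + 0.5)}') C"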
Reply
#33
What is displayed in the GUI is determined by the region you have set.
Reply
#34
Fair enough. Just weird that it shows F after having specified C in the config file.
Reply
#35
You aren't specifying the display scale, you're specifying the scale of the input. All temperatures in XBMC are displayed on the same scale based on the region setting. This is how it has been for years.
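For reference, the conversion applied for a Fahrenheit region is F = C × 9/5 + 32, so a 40 C reading displays as 104 F:
Code:
# sanity check of the Celsius-to-Fahrenheit conversion
awk 'BEGIN { printf("%.0f F\n", 40 * 9/5 + 32) }'   # prints "104 F"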
Reply
#36
I tested it as well and all worked fine.
Reply
#37
Nvidia temp not working for me :(

Code:
xbmc@xbmc:~$ nvidia-settings -tq gpuCoreTemp

ERROR: The control display is undefined; please run `nvidia-settings --help`
       for usage information.

xbmc@xbmc:~$

Installed following olympia's (wiki) guide, except it's an SVN build.
XBMC from SVN-PPA (17700),
Nvidia driver:180.29
Reply
#38
queeup Wrote:Nvidia temp not working for me :(

Code:
xbmc@xbmc:~$ nvidia-settings -tq gpuCoreTemp

ERROR: The control display is undefined; please run `nvidia-settings --help`
       for usage information.

xbmc@xbmc:~$

Installed following olympia's (wiki) guide, except it's an SVN build.
XBMC from SVN-PPA (17700),
Nvidia driver:180.29

Try this:
nvidia-settings -c :0 -tq GPUCoreTemp
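Setting DISPLAY instead of passing -c should behave the same way, if that's more convenient for a startup script:
Code:
# equivalent: pick the control display via the environment instead of -c
DISPLAY=:0 nvidia-settings -tq GPUCoreTemp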
Reply
#39
hmmm.
Code:
xbmc@xbmc:~$ nvidia-settings -c :0 -tq GPUCoreTemp -V

WARNING: Error querying attribute 'GPUCoreTemp' specified in query
         'GPUCoreTemp'; 'GPUCoreTemp' is not available on xbmc:0.0.

xbmc@xbmc:~$
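If the attribute simply isn't exposed on the X screen target, it may be worth querying the GPU target directly (a guess; I haven't verified this against driver 180.29):
Code:
# query the attribute on the GPU target rather than the X screen
nvidia-settings -c :0 -tq '[gpu:0]/GPUCoreTemp'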
Reply
#40
Hi! I'll try asking this question here and hope you guys can help me.

I have a GIGABYTE MA78GM-S2H, and my temps are also not displayed correctly.

What do I have to install on my Ubuntu 8.10 to make it work?

A little guide would make me very happy.

Thanks

EDIT: I followed the guide http://www.lm-sensors.org/wiki/iwizard/1
but when I then run "sensors", only this is displayed, and that can't be right:
Code:

root@htpc-desktop:/home/htpc# sensors
k8temp-pci-00c3
Adapter: PCI adapter
Core0 Temp: +23.0°C
Core0 Temp: +20.0°C
Core1 Temp: +21.0°C
Core1 Temp: +27.0°C

/Söder
Reply
#41
soder Wrote:A little guide, and I would be very happy.

http://wiki.xbmc.org/?title=HOW-TO:_Inst...monitoring
Reply
#42
olympia Wrote:http://wiki.xbmc.org/?title=HOW-TO:_Inst...monitoring

Ok, thanks, but is this right?

Code:
htpc@htpc-desktop:~$ sensors -u
k8temp-pci-00c3
Adapter: PCI adapter
Core0 Temp:
  temp1_input: 29.00
Core0 Temp:
  temp2_input: 28.00
Core1 Temp:
  temp3_input: 30.00
Core1 Temp:
  temp4_input: 37.00

it8718-isa-0228
Adapter: ISA adapter
in0:
  in0_input: 1.22
  in0_min: 0.00
  in0_max: 4.08
  in0_alarm: 0.00
in1:
  in1_input: 1.95
  in1_min: 0.00
  in1_max: 4.08
  in1_alarm: 0.00
in2:
  in2_input: 3.28
  in2_min: 0.00
  in2_max: 4.08
  in2_alarm: 0.00
in3:
  in3_input: 2.94
  in3_min: 0.00
  in3_max: 4.08
  in3_alarm: 0.00
in4:
  in4_input: 3.07
  in4_min: 0.00
  in4_max: 4.08
  in4_alarm: 0.00
in5:
  in5_input: 3.17
  in5_min: 0.00
  in5_max: 4.08
  in5_alarm: 0.00
in6:
  in6_input: 4.08
  in6_min: 0.00
  in6_max: 4.08
  in6_alarm: 1.00
in7:
  in7_input: 3.28
  in7_min: 0.00
  in7_max: 4.08
  in7_alarm: 0.00
in8:
  in8_input: 3.12
fan1:
  fan1_input: 0.00
  fan1_min: 0.00
  fan1_alarm: 0.00
fan2:
  fan2_input: 0.00
  fan2_min: 0.00
  fan2_alarm: 0.00
fan3:
  fan3_input: 0.00
  fan3_min: 0.00
  fan3_alarm: 0.00
temp1:
  temp1_input: 46.00
  temp1_max: 127.00
  temp1_min: 127.00
  temp1_alarm: 0.00
  temp1_type: 2.00
temp2:
  temp2_input: 49.00
  temp2_max: 80.00
  temp2_min: 127.00
  temp2_alarm: 0.00
  temp2_type: 3.00
temp3:
  temp3_input: 92.00
  temp3_max: 127.00
  temp3_min: 127.00
  temp3_alarm: 0.00
  temp3_type: 2.00
cpu0_vid:
  cpu0_vid: 1.55

I have an AMD 4850 CPU.

EDIT: Still only ? in XBMC...
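For a k8temp output like the one above, the cputempcommand has to pick a single reading out of the four; something along these lines might work (an untested sketch matched to that output):
Code:
# take the first Core0 reading from the k8temp chip and round it
echo "$(sensors -u k8temp-pci-00c3 | awk '/temp1_input/ {printf("%d", $2 + 0.5); exit}') C"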

/Söder
Reply
#43
Thanks for the guide, much appreciated! Will be following it shortly :)
Openelec Gotham, MCE remote(s), Intel i3 NUC, DVDs fed from unRAID cataloged by DVD Profiler. HD-DVD encoded with Handbrake to x.264. Yamaha receiver(s)
Reply
#44
I can hereby confirm that
Code:
<advancedsettings>
<gputempcommand>echo "$(nvidia-settings -tq gpuCoreTemp) C"</gputempcommand>
<cputempcommand>echo "$(sensors -u | grep "temp2_input"| awk '{print $2 }' |awk '{printf("%d\n",$1 + 0.5);}') C"</cputempcommand>
</advancedsettings>
shows the correct GPU temp for my Nvidia card (thanks rodalpho).

CPU temp info does not display correctly though.
lm_sensors is installed correctly, and the code added to advancedsettings.xml gives this output when entered in a terminal:
Code:
echo "$(sensors -u | grep "temp2_input"| awk '{print $2 }') C"
4.00
38.00 C

Could it be that, because my CPU has 2 cores, it doesn't know which value to use?

Hopefully someone can shed a light on this.

Cheers, dafart
Reply
#45
The code expects an integer. Hack off the ".00" and it should be fine.
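Rounding with awk, as in rodalpho's command, and exiting after the first match handles both the ".00" and the duplicate line in that output (a sketch, not tested on that board):
Code:
# round the first temp2_input match to an integer and stop there
echo "$(sensors -u | awk '/temp2_input/ {printf("%d", $2 + 0.5); exit}') C"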
Reply
