2023-07-01, 07:31
Long-time user, new forum account.
The current issue I'm working on is getting everything in my desktop setup to play nicely with my LG C2 television (model OLED42C2PUA).
This is a pretty sweet TV with an OLED panel that supports a full 4K Dolby Vision picture at 120Hz.
Of course, 120Hz is pretty new, so I believe I may be running into issues with my GPU talking to the TV and properly negotiating capabilities.
My GPU at the moment is a Sapphire NITRO+ Radeon RX 5700 XT 8GB. It reports itself as HDMI 2.0, so the connection to the TV simply isn't fast enough for 4K + HDR + 120Hz (rough math below).
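For context, here's my back-of-the-envelope math on why I'm convinced the card itself is the bottleneck. It's just my own Python scratchpad with rough assumptions (only the 18 Gbps / 48 Gbps link rates are official numbers, the rest is approximation):

```python
# Very rough estimate of uncompressed video bandwidth vs. HDMI link budgets.
# My assumptions, not spec-exact: ~19% blanking overhead, and an effective
# throughput of ~14.4 Gbps for HDMI 2.0 (18 Gbps minus 8b/10b encoding)
# and ~42.7 Gbps for HDMI 2.1 (48 Gbps minus 16b/18b encoding).

def video_gbps(width, height, refresh_hz, bits_per_channel, blanking=1.19):
    """Approximate data rate (Gbps) for an uncompressed RGB/4:4:4 signal."""
    pixels_per_second = width * height * refresh_hz * blanking
    return pixels_per_second * bits_per_channel * 3 / 1e9

HDMI_2_0_EFFECTIVE = 14.4
HDMI_2_1_EFFECTIVE = 42.7

need = video_gbps(3840, 2160, 120, 10)  # 4K, 120Hz, 10-bit HDR
print(f"4K120 10-bit RGB: ~{need:.0f} Gbps")           # ~36 Gbps
print(f"Fits HDMI 2.0? {need <= HDMI_2_0_EFFECTIVE}")  # False
print(f"Fits HDMI 2.1? {need <= HDMI_2_1_EFFECTIVE}")  # True
```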
I have verified all cabling is HDMI 2.1 capable by using an Xbox Series X (which is new enough to output the full-bandwidth signal, unlike my PC).
Ideally, I'd like to just leave Windows driving this TV at the highest-quality 60Hz signal possible at all times. I'm honestly not sure what that would be, given the panoply of color formats and bit depths available; I've tried to work it out below.
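This is as far as I got trying to reason it out myself: a quick and dirty comparison of some 4K60 formats against HDMI 2.0's effective budget, using the same rough assumptions as above (and ignoring details like HDMI carrying 4:2:2 in 12-bit containers):

```python
# Quick comparison of 4K60 output formats against HDMI 2.0's ~14.4 Gbps
# effective budget. Same rough assumptions as the sketch above; chroma
# subsampling is modeled simply as fewer samples per pixel.

def video_gbps(width, height, refresh_hz, bits_per_channel,
               samples_per_pixel, blanking=1.19):
    """Approximate data rate (Gbps) for an uncompressed video signal."""
    pixels_per_second = width * height * refresh_hz * blanking
    return pixels_per_second * bits_per_channel * samples_per_pixel / 1e9

HDMI_2_0_EFFECTIVE = 14.4  # Gbps after 8b/10b encoding overhead

formats = {
    "RGB/4:4:4  8-bit": (8, 3.0),
    "RGB/4:4:4 10-bit": (10, 3.0),
    "4:2:2     10-bit": (10, 2.0),
    "4:2:0     10-bit": (10, 1.5),
}
for name, (bits, samples) in formats.items():
    need = video_gbps(3840, 2160, 60, bits, samples)
    fits = "fits" if need <= HDMI_2_0_EFFECTIVE else "too much"
    print(f"4K60 {name}: ~{need:4.1f} Gbps -> {fits}")
```

If I'm reading my own numbers right, the realistic choice at 4K60 over HDMI 2.0 is basically 8-bit RGB versus 10-bit 4:2:2, and I don't know which of those Kodi / Windows would rather be fed.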
Currently, if I have HDR turned on for my LG TV and launch Kodi, Kodi doesn't seem to realize it's in HDR mode. If I play a video file with HDR10 (BT.2020 colorspace), the first run of the file produces super blown-out video until I hit stop.
When the player is stopped, Kodi then switches Windows' HDR mode off for the TV. If I re-run the same video file a second time without doing anything else, Kodi flips Windows' HDR mode back on and plays the file, which then displays correctly.
I'm especially confused about the option "Use 10 bit for SDR" found in Settings > System > Display and how that comes into this (if at all).
When it comes to SDR playback, if I'm after the best visual quality, would I want the higher-quality 10-bit video surface going out to a display that has HDR turned on?
Also, along similar lines, since I like to make Kodi look fancy, wouldn't I want to run the menu / library stuff in HDR10 if possible?
Hopefully I haven't missed something super obvious regarding this stuff 🤷