I just bought an HDR10 TV that I have my media/gaming PC plugged into. I'm familiar with all the color settings, and some look great while others don't. I'd like to know what the recommended settings for video output to the TV are.
1) RGB limited (16-235), 8-bit (doesn't look amazing)
2) RGB full (0-255), 8-bit (looks the best in my opinion, but it isn't 10-bit, and when set to HDR, Windows reports "HDR mode 8-bit dither")
3) YCbCr 4:2:2 limited (16-235), 10-bit (From my understanding, this is what true HDR content uses, but the picture isn't very pleasing on the desktop; the limited color range seems to be graying out blacks, and whites aren't as bright.)
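To put numbers on why option 1/3's limited range can look washed out, here's a rough sketch of the standard 16-235 expansion (an illustration of the math only, not any specific driver's implementation):

```python
# Why a limited-range (16-235) signal looks gray when a display expects
# full range (0-255): black sits at code 16 and white at code 235 unless
# the TV expands the range back out.

def limited_to_full(v):
    """Expand an 8-bit limited-range (16-235) code value to full range (0-255)."""
    return round((v - 16) * 255 / (235 - 16))

# If the TV expands correctly, limited black/white map back to true 0/255:
print(limited_to_full(16))   # black -> 0
print(limited_to_full(235))  # white -> 255

# If the PC sends limited range but the TV interprets it as full range,
# "black" is displayed at code 16 (dark gray) and "white" caps at 235 (dim),
# which matches the graying blacks and dimmer whites described above.
```

So the washed-out look usually means the PC's range setting and the TV's range setting disagree, not that limited range is inherently worse.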
In games, I notice a difference: RGB full range looks the best. In movies, I don't really notice a huge difference between RGB full 8-bit dither and 4:2:2 limited 10-bit.
I don't like the idea that 4:2:2 shares chroma between neighboring pixels to save bandwidth, but the difference is negligible to my eyes. My initial thought from these findings is to just use RGB full 8-bit dither, but I don't want to miss out on the full capabilities of HDR in movies or games that support it.
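For what it's worth, the bandwidth savings is the whole reason 4:2:2 gets offered here. A back-of-envelope sketch (approximate figures; assumes a 4K60 signal and HDMI 2.0's roughly 14.4 Gbps of usable data rate after 8b/10b encoding overhead on the 18 Gbps link):

```python
# Rough per-format data rates at 4K 60 Hz, compared against HDMI 2.0's
# approximate usable bandwidth. Shows why RGB 10-bit doesn't fit and
# why the 10-bit option is offered as 4:2:2 instead.

PIXELS_PER_SEC = 3840 * 2160 * 60  # 4K at 60 Hz

def gbps(bits_per_pixel):
    return PIXELS_PER_SEC * bits_per_pixel / 1e9

formats = {
    "RGB 8-bit":          3 * 8,   # 24 bpp
    "RGB 10-bit":         3 * 10,  # 30 bpp
    "YCbCr 4:2:2 10-bit": 2 * 10,  # chroma shared by pixel pairs -> ~20 bpp avg
}

HDMI20_USABLE = 14.4  # Gbps, approximate

for name, bpp in formats.items():
    rate = gbps(bpp)
    verdict = "fits" if rate <= HDMI20_USABLE else "exceeds"
    print(f"{name}: {rate:.1f} Gbps ({verdict} HDMI 2.0)")
```

RGB 10-bit lands around 14.9 Gbps, just over the limit, which is why the GPU makes you choose between full RGB at 8-bit (with dithering) and 10-bit with chroma subsampling on an HDMI 2.0 connection.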
Does anyone have any input on recommended settings, and why?