PC HDR Sparkles
Hi All,

Does anyone know why I might be getting sporadic white sparkles when playing games in HDR or watching an HDR movie (MKV / M2TS)?
My setup is PC > AVR > TV.
TV is a 2016 LG OLED C series, the PC is a high-end 8086K / RTX 2080 Ti build, and the AVR is a Yamaha RX-A2060. The HDMI cables are Premium certified and are all only 1.5m long.
The PC runs buttery smooth, temps are all within norms even under load, and the PSU is more than enough to cope.
RE7 and RE2 are the two games that manifest the sparkles the worst, with movie playback very much hit or miss.
The sparkles are only visible on a dark / black background, so with both RE games it's very noticeable. FC5 and SOTTR didn't appear to manifest any sparkles, but they are both much brighter games so maybe I just didn't notice them.
I have tried different cables, changing HDMI inputs / outputs on the TV / AVR, and even connecting the PC directly to the TV, and the problem persists. I have also tried my old 1080 Ti with the same results.
The only solution is to disable HDR in-game.
PS4 Pro and LG UP970 UHD player don't have this issue.
Any help would be greatly appreciated.
Thanks
C

Do the sparkles follow the contours of objects in game (i.e. they're a computer issue) or are they overlaid on the screen (they could be a connection / TV issue)?
I take it you've put the PC back to stock speeds?

Hi EndlessWaves,
Thanks for the reply.
It's hard to describe, so I have attached a photo to better explain.
The attached photo was taken from a 4K movie, which was easier to capture with my camera than playing a game one-handed until a scene demonstrated the same results, but hopefully you get the idea.
RE2 Remake and RE7 both exhibit this behaviour at points throughout the game, not to the same extent as shown in the pic, but enough to be annoying nonetheless.
Thanks

I missed the part about the movies; that suggests it's probably not a stress issue on a particular part of the GPU.
It might be worth trying the Intel integrated graphics to check whether it's an nVidia issue.

On a PC it's possible to exceed the 18Gbps limit of HDMI 2.0 by having signal settings that exceed the spec.
For example, 12-bit 4:4:4 HDR.
See here
4K 60 4:4:4 HDR
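If you want to sanity-check the numbers yourself, here's a rough Python sketch I knocked up. It's a back-of-the-envelope estimate assuming the standard CTA-861 4K60 timing (4400 x 2250 total raster including blanking, i.e. a 594 MHz pixel clock) and HDMI 2.0's 600 MHz TMDS ceiling, so treat the figures as approximate:

    # Rough HDMI 2.0 bandwidth check for 4K60 signal formats.
    # Assumes the standard CTA-861 4K60 timing: 4400 x 2250 total
    # raster (3840 x 2160 active plus blanking) = 594 MHz pixel clock.
    # HDMI 2.0 tops out at a 600 MHz TMDS character rate (18 Gbps
    # across three lanes at 10 bits per character).

    PIXEL_CLOCK_MHZ = 4400 * 2250 * 60 / 1e6  # = 594.0 MHz

    def tmds_clock_mhz(bit_depth, chroma):
        # Deep colour scales the clock up from the 8-bit base;
        # 4:2:0 halves it because two pixels share each chroma
        # sample. (4:2:2 is a special fixed-container case and is
        # skipped here for simplicity.)
        clock = PIXEL_CLOCK_MHZ * bit_depth / 8
        if chroma == "4:2:0":
            clock /= 2
        return clock

    for depth in (8, 10, 12):
        for chroma in ("4:4:4", "4:2:0"):
            mhz = tmds_clock_mhz(depth, chroma)
            gbps = mhz * 10 * 3 / 1000  # 10 bits/char, 3 TMDS lanes
            verdict = "OK" if mhz <= 600 else "exceeds HDMI 2.0"
            print(f"4K60 {chroma} {depth}-bit: {gbps:5.2f} Gbps ({verdict})")

By that reckoning 4K60 4:4:4 at 10-bit or 12-bit blows past 18Gbps, while the 4:2:0 formats all fit.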
What are your graphics signal settings whilst playing HDR content?

Will give that a try tonight after work. Does the integrated Intel UHD 630 graphics inside my 8086K support HDR?
Thx

Thanks for your input Andy.
My standard graphics settings are set to Default within the Nvidia control panel, and Windows flicks them to 4:2:0 10-bit automatically when an HDR signal is detected.
My TV does have a 'PC Input' function that can be assigned to any HDMI input. From what I have read this allows a 4:4:4 signal to be accepted, but it borks HDR input signals when enabled, so I have never used it. I would assume that if I enabled 4:4:4 on the TV and then selected it within the Nvidia control panel I would only be making the situation worse.
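Going by the rough calculator posted above, my current 4K60 4:2:0 10-bit signal should only need around 11 Gbps, well inside the 18Gbps HDMI 2.0 limit, whereas 4:4:4 10-bit would be around 22 Gbps, so forcing 4:4:4 really would push things over the edge.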
I'm leaning towards your assessment pointing towards a bandwidth issue, so I might just need to bite the bullet and buy a couple of premium 48Gbps-rated HDMI cables and see if they solve the problem.
Thanks

Be careful there.
48G cables are “Ultra” certified, not “Premium”, and most on sale to date are fakes.
The HDMI CTS document for Category 3 / Ultra certification was only released a couple of weeks ago, so genuine “Ultra” rated cables may not be available for a while yet.
Make sure whatever you buy is carrying the official logo and QR code.
HDMI :: Manufacturer :: HDMI 1.4 :: Finding the Right Cable

Yeah, they added it back in 2017, not long after Microsoft added support to Windows.

Thanks for the heads up on that one. Didn't know that.
I'll stay well clear of Amazon and eBay and stick to more reputable AV dealers.
C