Hi folks, I'm looking to upgrade my GPU (AMD Radeon R7 360), which is somewhat antiquated. I need a card that can drive a 4K HDR10+ 55-inch TV, and this is where I'm getting confused with refresh rates, HDMI 2.0b, etc.
My setup is as follows:
GPU: Radeon R7 360
OS: Windows 8.1 (64-bit)
RAM: 16 GB (DDR3)
CPU: AMD FX-6350 six-core
Motherboard: Asus M5A97 EVO R2.0
PSU: 550 W
The GPU is connected over HDMI 2.0 via an active adapter, running into a Denon AVR-X2300W, then on to the TV (a Samsung 55-inch 4K HDR10 set, 1900 PQI).
As it's an old GPU with HDMI 1.4 ports, I can only get 4K at 30 Hz. Terrestrial digital TV plays fine at 60 Hz, but the TV will NOT turn on HDR unless the refresh rate is dropped to 25 Hz or lower, which makes it very choppy. It's the same with web streaming: I have to use 25 Hz for HDR to come on (which vastly improves the picture quality).
So am I right in saying that with an upgraded GPU that supports HDR10, I won't have to drop to 25 Hz? (30 Hz will not turn on HDR; it has to be 25 Hz or lower. The TV came out a year ago, so it's well up to date.)
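For what it's worth, here's the back-of-the-envelope bandwidth arithmetic I've been working from (a rough sketch assuming the standard CTA 4K timing of 4400 x 2250 total pixels per frame and TMDS clock ceilings of roughly 340 MHz for HDMI 1.4 and 600 MHz for HDMI 2.0/2.0b; please correct me if those figures are off):

```python
# Back-of-the-envelope check: which 4K modes fit under which HDMI version?
# Assumed figures (approximate, not taken from any spec sheet for my card):
#   - CTA 4K timing: 4400 x 2250 total pixels per frame, including blanking
#   - TMDS clock ceilings: ~340 MHz for HDMI 1.4, ~600 MHz for HDMI 2.0/2.0b
#   - HDR10 at 10-bit RGB/4:4:4 raises the TMDS clock by 10/8 = 1.25x

H_TOTAL, V_TOTAL = 4400, 2250
LIMITS_MHZ = {"HDMI 1.4": 340, "HDMI 2.0b": 600}

def tmds_clock_mhz(refresh_hz: int, bits_per_channel: int) -> float:
    """Approximate TMDS clock (MHz) for a given refresh rate and colour depth."""
    pixel_clock = H_TOTAL * V_TOTAL * refresh_hz / 1e6
    return pixel_clock * bits_per_channel / 8

for refresh in (30, 60):
    for bits in (8, 10):
        clock = tmds_clock_mhz(refresh, bits)
        verdict = ", ".join(
            f"{name}: {'OK' if clock <= limit else 'too high'}"
            for name, limit in LIMITS_MHZ.items()
        )
        print(f"4K @ {refresh} Hz, {bits}-bit 4:4:4 -> {clock:.1f} MHz ({verdict})")
```

If those numbers are anywhere near right, 4K30 at 8-bit just fits through HDMI 1.4 but 10-bit HDR doesn't, which would explain my problem, and a HDMI 2.0b card should manage 4K60 with HDR (apparently by dropping to 4:2:2 or 4:2:0 chroma subsampling to fit 10-bit at 60 Hz).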
I've been looking at the Radeon RX Vega 56, which claims HDR support.
Any suggestions, or corrections if my thinking is misinformed, would be greatly appreciated.
G