Varsas Publish time 2-12-2019 03:41:01

They did, at one time. I had (I think I've got this the right way around) an ATI card which would take an external audio signal in (e.g. from the sound card) and then push it out via HDMI. My next card (an Nvidia) actually showed up as a separate sound card for audio via HDMI. That wasn't a particularly helpful way of doing things (the ATI solution was much better) and caused a few issues.

Not sure what modern graphics cards do.

See the section 'HDMI audio'
How do I setup my NVIDIA based graphics card to work with my HDTV? | NVIDIA

EndlessWaves Publish time 2-12-2019 03:41:01

They both initially had S/PDIF inputs, but ATI/AMD only had it for a few models before swapping to a Windows audio device, while nVidia retained it much longer. Modern cards use the Windows audio device solution.

I'd expect internal S/PDIF to have been PCM though, as it was originally designed to get audio from the CD drive to the system so there probably wasn't any format conversion going on there.

Darko Publish time 2-12-2019 03:41:02

I had a similar issue with my HTPC, and the build was up to the job (Ryzen, 16 GB RAM and a GT 1030), yet no matter what I did I had a noisy picture and the sound was out of sync.
After messing around for a while I downloaded PotPlayer, and I now have 4K at 60 Hz with no noise in the picture or sound.
Odd really, as I can't see why VLC didn't have the codec support. Either way, it might be worth a go before you shell out.

Cheers Darko

THX1138UK Publish time 2-12-2019 03:41:03

For 2160p (4K UHD), you only need 60 Hz (59.94 Hz). The Intel 620 integrated graphics built into (some) Core i5 processors is perfectly sufficient for 4K 60 Hz output for movies and TV (it supports all frame rates in popular use).
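To get a feel for why 4K at 60 Hz needs an HDMI 2.0-class link in the first place, a rough bandwidth estimate can be sketched. This is a simplification: it counts active pixels only and ignores blanking intervals and TMDS encoding overhead, so the real link requirement is higher (HDMI 2.0 tops out at 18 Gbit/s).

```python
# Rough active-pixel bandwidth for a given mode, ignoring blanking
# intervals and encoding overhead (so real requirements are higher).
def video_bandwidth_gbps(width, height, fps, bits_per_pixel=24):
    return width * height * fps * bits_per_pixel / 1e9

# 3840x2160 @ 60 Hz, 8-bit RGB (24 bits per pixel) ~= 11.9 Gbit/s,
# which already rules out HDMI 1.4's ~10 Gbit/s class of link.
uhd60 = video_bandwidth_gbps(3840, 2160, 60)
print(f"4K60 8-bit RGB (active pixels only): {uhd60:.1f} Gbit/s")
```

Bump `bits_per_pixel` to 30 for 10-bit HDR and the figure grows accordingly, which is why HDR at 4K60 pushes even closer to the HDMI 2.0 ceiling.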

The only real requirement for 4K UHD is a modern(ish) CPU with hardware H.265 decoding built in. I have an Intel Core i5-6200U, which is a Kaby Lake variant, and I use that as a media PC for 4K UHD content. I just use the Intel built-in graphics (it's a laptop) and it works perfectly.

You say you have a 3 GHz Intel Core i5, but you don't say which model. Have you tried using the built-in graphics, rather than your graphics card? If it's an older model, it will only have H.264 hardware decoding, and software decoding isn't fast enough for 4K HEVC.
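As a rough rule of thumb for working out whether a given Core i5 can hardware-decode HEVC, the generation number (the first digit(s) of the model number, e.g. the 6 in i5-6200U) is what matters. The boundaries in this sketch are assumptions drawn from Intel's Quick Sync generations and are worth double-checking for a specific part:

```python
# Illustrative mapping of Intel Core generation -> HEVC decode support.
# Assumed boundaries: 8-bit HEVC hardware decode arrived around the
# 6th gen (Skylake), 10-bit HEVC around the 7th gen (Kaby Lake);
# earlier generations only have hardware H.264 decode.
def hevc_decode_support(core_generation):
    if core_generation >= 7:
        return "HEVC 8-bit and 10-bit in hardware"
    if core_generation == 6:
        return "HEVC 8-bit in hardware, 10-bit in software"
    return "H.264 only in hardware; HEVC falls back to software"

print(hevc_decode_support(2))  # e.g. an i5-2500K (Sandy Bridge)
print(hevc_decode_support(7))  # e.g. an i5-7200U (Kaby Lake)
```

This lines up with the reports elsewhere in the thread that an i5-2500K cannot manage 4K playback.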

The nVidia GTX 1650 is a great graphics card, but it's complete overkill for your needs.

HDCP and display audio can sometimes be problematic on PCs. The video playback application that you use can also be significant. Although I don't like the built-in video app in Windows 10, it does actually work well with 4K content.

Is HDR something you are interested in? This can be problematic, as it's not always available via PC HDMI output (this is an issue for the Intel 620 integrated graphics), and you may find HDR is only available via DisplayPort. If the monitor/TV doesn't have a DisplayPort input, then you will need an ACTIVE (not standard passive) DisplayPort converter, such as the Club3D CAC-1080 (HDMI 2.0b edition), to make HDR work.

If your current Intel Core i5 CPU doesn't support 4K UHD, then you may want to consider replacing it with a processor that does (you may also need a new motherboard); this may be a better option than replacing your graphics card.


Regards,
James.

Darko Publish time 2-12-2019 03:41:04

Off the back of this: I had an early i5 (2500K) and it would not play 4K.


Cheers Darko

Darren Heal Publish time 2-12-2019 03:41:05

The GTX 1650 graphics card was fitted yesterday afternoon and all the problems with noise in the sound have gone, which is why I wanted to change the card to begin with. The new card will pump out DTS Neural:X and Atmos as well as all the legacy codecs. Still playing with refresh rates, but so far so good at 3840 x 2160 too.

Darren Heal Publish time 2-12-2019 03:41:05

BTW, the receivers in question are a Marantz 1608 downstairs in the living room and a Marantz 7010 upstairs in the media room. TVs are a Samsung 8-series 55 inch in the living room and an LG 86UK6570PUB 86 inch in the movie room. AFAIK all support HDR10 and HDMI 2.0 (not sure about 2.0a). Both HTPCs seem to run fine with 8 GB of RAM, but I might up that to 16 just to be sure when my pocket money allows. RAM is, after all, "as cheap as chips" (pun intended) these days.

Coulson Publish time 2-12-2019 03:41:07

Are you sure the 950, 960 and 980 do? Maybe they work better, but those cards came out well before H.265 took off, so I don't see how that is possible. You may be talking about H.264?

Varsas Publish time 2-12-2019 03:41:07

If the TVs can receive HDR from an external source (e.g. a 4K Blu-ray player), they are 2.0a. HDR TVs from after 2016 should be fine, and some before that. An HDR TV that can't actually receive an HDR signal sounds odd, but they do exist! If they are not 2.0a, I believe you won't be able to send them an HDR signal from your computer, which will also need to support 2.0a.

richardsim7 Publish time 2-12-2019 03:41:08

Nope, they definitely have an onboard chip to decode HEVC, which the 970 does not for some bizarre reason