As I say, it depends on what functionality you want. If you just want USB hub functionality, so you can plug devices into your screen and have them connect to your computer, then any screen with USB ports will provide that.
If you want a single USB cable that carries video to the monitor, connects USB devices and provides power to the laptop, then you'll need a screen that supports all of that over USB Type-C (Type-A/B can't carry proper video output).
But there are screens that have a Type-C connector and just act as a USB hub, so you need to make sure the USB connection does what you want even if it's using the Type-C connector.
HDR means more contrast, achieved by pushing the highlights in the picture closer to their real-life brightness; the aim is a more realistic picture. As there's been no breakthrough in the native contrast of LCD panels, this has to be done by brightening or dimming the backlight in the appropriate places, rather than having it remain at a constant brightness as normal. This is known as local dimming.
The finer the control the screen has over the backlight, the smaller the areas that can be dimmed and the wider the brightness range can be. A good backlight is divided into hundreds of zones, each one small enough to handle details like light sources and glints off metal.
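To make the mechanism concrete, here's a toy sketch of the idea in Python (not any vendor's actual algorithm): the frame is split into a grid of zones and each zone's backlight is driven by the brightest pixel it covers.

```python
import numpy as np

def zone_backlight(frame, zones=(2, 3)):
    """Toy local-dimming pass: one backlight level per zone.

    frame: 2D array of target luminance, 0.0-1.0.
    zones: (rows, cols) of the backlight grid.
    """
    rows, cols = zones
    h, w = frame.shape
    zh, zw = h // rows, w // cols
    levels = np.zeros(zones)
    for r in range(rows):
        for c in range(cols):
            # Drive each zone from its brightest pixel so highlights
            # aren't clipped; the LCD in front then darkens the rest.
            levels[r, c] = frame[r * zh:(r + 1) * zh, c * zw:(c + 1) * zw].max()
    return levels

# A small bright highlight in an otherwise dark scene:
frame = np.full((360, 640), 0.05)
frame[100:110, 200:210] = 1.0
print(zone_backlight(frame, (2, 3)))   # one huge zone must light up fully
print(zone_backlight(frame, (6, 8)))   # a finer grid keeps more of the frame dark
```

With only a handful of zones, the entire region around a small highlight gets brightened, washing out the surrounding shadows; with hundreds of zones the spill is far smaller.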
All of the current HDR monitors have around half a dozen zones, if that, so they're very limited in the contrast increase they can provide: HDR isn't about raising the overall picture brightness, so the large bright areas that such big zones could handle well are uncommon. There are a couple of models with 384 zones coming out this year, which should look much better - akin to a high-end TV like a Sony XE94.
HDR is also being used as an opportunity to introduce a new colour space. It's very hard to introduce a new colour space piecemeal, so although wide colour gamut technology has been cheaply available for a decade or more, it's been primarily used on screens targeted at photographers and video editors. The web has largely ignored the technology, so many pages still lack information about their colour space and will display incorrectly on a wide gamut display.
HDR requires the display to shift to a different mode because of the way it specifies brightness. By ensuring all HDR content is specified in a very wide colour space from the start, wide gamut monitors can emulate the normal gamut (sRGB) for everything non-HDR and end up with none of the oversaturation they currently suffer in most programs.
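To illustrate what that emulation is doing, here's a sketch using the published linear-RGB to XYZ matrices for sRGB and Rec.2020 (the wide space HDR is built around). A wide gamut panel that passes untagged sRGB values straight through is effectively substituting its own far more saturated primaries; correct handling remaps the values like this:

```python
import numpy as np

# Published linear-RGB -> CIE XYZ matrices (D65 white point).
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
REC2020_TO_XYZ = np.array([[0.6370, 0.1446, 0.1689],
                           [0.2627, 0.6780, 0.0593],
                           [0.0000, 0.0281, 1.0610]])

def srgb_to_rec2020(rgb):
    """Values a Rec.2020 panel must show to match a linear sRGB colour."""
    return np.linalg.inv(REC2020_TO_XYZ) @ SRGB_TO_XYZ @ rgb

# Full sRGB red only needs ~63% of the wide gamut's red primary;
# showing it at 100% instead is the oversaturation described above.
print(srgb_to_rec2020(np.array([1.0, 0.0, 0.0])))  # ~[0.63, 0.07, 0.02]
```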
As well as the extremely limited hardware, the other issue with current HDR monitors is that they only support the home cinema-focused HDR10 format. The problem with that format is that brightness values are absolute: each signal value corresponds to a fixed number of nits.
You'd normally view something like a web page at completely different brightnesses on your phone in summer and on your laptop in bed, and that works currently because the current standard defines the picture relative to the display's maximum brightness.
With fixed brightness values you could either have a blindingly bright website you couldn't look at in bed, or a website so dim you couldn't see it during the day.
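HDR10's brightness encoding is the SMPTE ST 2084 "PQ" curve, whose published EOTF maps a signal value straight to absolute nits; a quick sketch shows why the result can't adapt to the viewing environment:

```python
def pq_eotf(signal):
    """SMPTE ST 2084 (PQ) EOTF: signal in 0-1 -> absolute cd/m2 (nits).

    The same signal value means the same luminance on every display.
    """
    m1, m2 = 1305 / 8192, 2523 / 32
    c1, c2, c3 = 107 / 128, 2413 / 128, 2392 / 128
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for s in (0.25, 0.5, 0.75, 1.0):
    print(f"signal {s:.2f} -> {pq_eotf(s):7.1f} nits")
# 0.50 comes out at ~92 nits and 1.00 at 10000 nits whether you're
# on a phone in the sun or a laptop in a dark bedroom.
```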
Rather than relying on the display to try and guess which parts of the picture need to be brightened, it seems likely that applications such as the web will adopt one of the relative-brightness HDR standards, such as HLG.
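HLG, by contrast, is scene-relative: BT.2100 defines a system gamma that scales the output with the display's own peak brightness. A rough sketch of that relationship (the constants are from the standard; the scene value is assumed to be already linearised):

```python
import math

def hlg_display_nits(scene, peak_nits):
    """BT.2100 HLG system gamma: output scales with the display's peak.

    scene: normalised scene luminance, 0-1 (signal already linearised).
    peak_nits: peak brightness of the display showing it.
    """
    gamma = 1.2 + 0.42 * math.log10(peak_nits / 1000)  # BT.2100 extension
    return peak_nits * scene ** gamma

for peak in (300, 1000, 2000):  # dim laptop panel, HDR TV, brighter TV
    print(f"{peak:4d} nit display -> {hlg_display_nits(0.5, peak):6.1f} nits")
# The same value lands at a sensible fraction of each display's range,
# where a PQ value would demand the same fixed nits everywhere.
```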
So something like the LG 32UD99-W is a good screen by normal monitor standards, just be aware of its very limited support for HDR. If HDR is a driving force behind your purchase then I'd strongly consider waiting a couple of years for HDR hardware production to get going and the full set of standards to be decided on.