For the avoidance of doubt, a data network transport - be it Wi-Fi, Ethernet, DSL, or whatever - does not affect the "quality" of anything it conveys: it's all just bits, and they arrive entirely intact or not at all. It's not like the days of (for example) old analogue TV, where the signal conditions could gradually worsen, yielding a picture with more and more "snow" on it until you couldn't see anything. Digital transmission is either perfect or completely absent - there's nothing in between. If you get lost packets, that tends to manifest as glitches and/or "pops" in the playback, or the dreaded "buffering" - not "quality" loss.
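To put that "all or nothing" behaviour in concrete terms, here's a minimal sketch (in Python, with a made-up frame format - real Ethernet/Wi-Fi frames carry a CRC-32 frame check sequence in much the same spirit): the receiver verifies a checksum over the frame and, if it doesn't match, drops the frame entirely. Nothing "slightly degraded" is ever handed up to the application.

```python
import zlib

def deliver_frame(frame: bytes):
    """Return the payload if the trailing CRC-32 checks out, else None (frame dropped)."""
    if len(frame) < 4:
        return None                      # too short to even carry a checksum
    payload, received_crc = frame[:-4], frame[-4:]
    if zlib.crc32(payload).to_bytes(4, "big") == received_crc:
        return payload                   # delivered bit-for-bit intact
    return None                          # corrupted in transit: dropped, never "degraded"

# Build a frame, then flip one bit to simulate interference on the link
payload = b"some video data"
frame = payload + zlib.crc32(payload).to_bytes(4, "big")
damaged = bytes([frame[0] ^ 0x01]) + frame[1:]

print(deliver_frame(frame))    # b'some video data'
print(deliver_frame(damaged))  # None - the frame never arrives at all
```

A dropped frame is what eventually shows up as a glitch, a pop, or a rebuffer at the player, depending on how the application copes with the gap.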
As others have indicated, the technology in the end stations may be capable of assessing the performance of the end-to-end path between source and sink and "throttling" the transmission rate or reducing quality to cope. But that's entirely a function of the source and sink devices (if they do it at all), not something the network transport implements.
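For illustration, here's a toy version of the kind of rate adaptation a player might do - the bitrate ladder and the 80% safety margin are invented for the example, and real adaptive-streaming players (DASH/HLS and the like) use considerably more elaborate logic. The point is simply that the decision lives in the endpoints: measure recent throughput, then pick the highest rendition that fits comfortably within it.

```python
# Hypothetical bitrate "ladder" in kbit/s - real services publish their own renditions.
LADDER = [20000, 8000, 5000, 2500, 1200, 500]

def pick_rendition(throughput_samples_kbps, safety=0.8):
    """Pick the highest rendition that fits comfortably inside recent measured throughput."""
    estimate = sum(throughput_samples_kbps) / len(throughput_samples_kbps)
    budget = estimate * safety           # leave headroom so the playback buffer doesn't drain
    for rate in LADDER:
        if rate <= budget:
            return rate
    return LADDER[-1]                    # even the lowest rendition may stall, but it's the best available

print(pick_rendition([9000, 10000, 11000]))   # 8000 - the link will carry more, so quality goes up
print(pick_rendition([1000, 900, 1100]))      # 500  - quality drops, but playback keeps going
```

Notice the network never features in that code except as a throughput number the endpoint measured for itself.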
So I wouldn't get hung up on whether (for example) 2.4GHz or 5GHz somehow affects your playback quality - in and of itself, the band makes no difference. It's all about the bandwidth across the channel, and even if bandwidth is a problem, Wi-Fi itself won't change the "quality" of the stream; it would be up to the source/sink end stations to do so. Things like BBC iPlayer and YouTube do this, for example, but static text pages on web servers (and pretty much everything else) don't bother.