How Bluetooth Takes Away the Wire
How has Bluetooth as a technology developed over the last decade or so for hi-fi?
The initial audio use case was not hi-fi – it was mono earpieces (the kind you used to see taxi drivers wearing), and then laws were passed requiring some sort of hands-free device in the car, so sales of those earpieces went through the roof.
In terms of its acceptance in hi-fi, for stereo audio, that was more problematic – and it was problematic not because it was inherently bad, but because it was inherently unclear what you would get.
It could be good, it could be bad, it could be anything in between. That comes down to the way the original mandatory audio codec in Bluetooth was designed: it can be operated at a number of different bitrates, from very low, which would give you poor audio, to pretty high, which would give you good-quality audio. But you’d buy a product and you wouldn’t know what you were going to get, so brands whose credibility was built on audio quality initially resisted adopting Bluetooth.
So what changed that?
What changed that was actually our original aptX technology. aptX uses a fixed bitrate, it was designed to deliver very close to CD quality, and any product that features aptX goes through something called an IOP (interoperability) test. That means that if it’s, for example, a speaker, it is tested with loads of phones and other devices that might act as the transmitter; if it’s a phone or another transmitting device, it is tested with loads of headphones, speakers and other devices that might be used for playback.
The consumer can then see the logo and know what he or she is going to get – a reproduction close to CD quality – and can be certain that it’s going to work, because it has been tested; hence the use of the logo.
This is how aptX changed things. It opened up the market: a lot of high-end audio brands who wanted to offer something wireless as a choice to their customers felt for the first time that they could put Bluetooth in their products, and Cambridge Audio was one of the first brands to do so.
How did aptX HD follow aptX?
By 2015 the requirement was for something better than CD quality. Hi-res audio services were beginning to come online, hi-res content was available to download, and people wanted more.
We always had the ability to run aptX at 24-bit resolution – we’d been doing it in the broadcast world for a long time – and we were able to bring that to Bluetooth in 2016 with aptX HD. Again, Cambridge Audio saw the potential very early on, featuring aptX HD in a number of products including the flagship Edge series.
To most people – and this was confirmed by a peer-reviewed study at Salford University – aptX HD is indistinguishable from a high-resolution audio source. At that point you know you’ve got a solution that should meet the needs of the most critical of consumers. There are always people who want more, and that’s what we have now responded to with the launch of aptX Adaptive, a technology capable of reaching 24-bit/96kHz resolution over Bluetooth.
What are the common misconceptions about wireless audio?
One of them is “lossless versus lossy” and the assumption that lossy is, by definition, inherently bad – it doesn’t need to be.
Bluetooth is bitrate-constrained, and that’s tied to its advantage in small devices like Melomania earbuds: you’ve got a small device with a super-small battery, and you can’t run a high-bitrate, high-power radio in the ear. A personal audio product like an earbud needs to be low power. Bluetooth is designed from the ground up for low power, and that’s one of the reasons why it’s low bitrate. You’re not going to see Wi-Fi in a wireless earbud any time soon, because it would burn through the battery in no time and would risk presenting safety challenges.
With low power come low bitrates, and that poses challenges. An awful lot of the aptX story is about solving those challenges.
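To put rough numbers on that (these figures are illustrative and not from the interview; the codec rates quoted are commonly cited approximations): uncompressed CD-quality stereo PCM needs roughly 1.4 Mbit/s and 24-bit/96kHz stereo roughly 4.6 Mbit/s, while Bluetooth audio codecs work in the low hundreds of kilobits per second – hence the need for efficient coding such as aptX. A minimal sketch of the arithmetic:

```python
# Rough, illustrative arithmetic (not from the interview): why uncompressed PCM
# audio won't fit inside the bitrates available on a Bluetooth audio link.

def pcm_bitrate(sample_rate_hz: int, bit_depth: int, channels: int = 2) -> int:
    """Bitrate in bit/s of uncompressed PCM audio."""
    return sample_rate_hz * bit_depth * channels

cd_pcm      = pcm_bitrate(44_100, 16)  # 1,411,200 bit/s (~1.4 Mbit/s)
hires48_pcm = pcm_bitrate(48_000, 24)  # 2,304,000 bit/s (~2.3 Mbit/s)
hires96_pcm = pcm_bitrate(96_000, 24)  # 4,608,000 bit/s (~4.6 Mbit/s)

# Commonly quoted approximate codec rates (kbit/s) - assumptions for
# illustration, not official specifications.
APTX_KBPS          = 352  # aptX, 16-bit/44.1kHz
APTX_HD_KBPS       = 576  # aptX HD, 24-bit/48kHz
APTX_ADAPTIVE_KBPS = 420  # aptX Adaptive, upper end of its variable range

print(f"CD-quality PCM   : {cd_pcm / 1e6:.2f} Mbit/s")
print(f"24-bit/96kHz PCM : {hires96_pcm / 1e6:.2f} Mbit/s")
print(f"aptX vs 16/44.1 PCM        : ~{cd_pcm / (APTX_KBPS * 1000):.0f}:1")
print(f"aptX HD vs 24/48 PCM       : ~{hires48_pcm / (APTX_HD_KBPS * 1000):.0f}:1")
print(f"aptX Adaptive vs 24/96 PCM : ~{hires96_pcm / (APTX_ADAPTIVE_KBPS * 1000):.0f}:1")
```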
Sometimes I get into discussions about the qualities of aptX purely on an audio-performance basis versus something else, and people are actually missing the point a bit. We are trying to deliver robust audio over Bluetooth, and that puts limits on you. It’s partly bitrate, but there are also limits to do with co-existing with Wi-Fi. Bluetooth and Wi-Fi operate in the same 2.4GHz radio band, so they can interfere with each other. Wi-Fi is much higher in power, so you’ve got a heavyweight and a lightweight competing for the same attention, and it’s always going to be Bluetooth that suffers from the interference.
So our story is not just about audio quality over Bluetooth – although if you haven’t got that, we believe you haven’t got anything, so we solve that first – it’s about solving all these other problems too. Robustness, latency, co-existence with Wi-Fi – these are the other chapters in that story.
Thanks for your time, Guy!