THE WONDERS OF WIRELESS!
How the evolution of Wi-Fi changed the world
HERE WE ARE AGAIN, unpacking a hopelessly complex series of standards in the pursuit of knowledge. We like to call it fun. For this issue’s outing, we’re looking at wireless networking. Not the frilly protocol level—that’s way too neat—but the hardware and signaling levels, the nuts and bolts that enable routers and adapters to communicate with each other, and do so now at enormous speeds.
While once you might have been thrilled to get 2,400 bits of data per second down your dial-up phone line—literally seeing the bytes of data appearing on the screen, taking agonizing seconds—now, single-digit megabytes per second speeds simply aren’t good enough for our wireless connections. So, we’re going to delve into the 802.11 wireless standard, explaining the basics of how digital data can be sent over radio frequencies with various modulations, and how each evolution of the standard introduced new enhancements and so greater throughput, alongside how knowing all of these esoteric details can enable you to get even more from your own networks.
We’ll literally start at “a” and end at “ax,” we’ll see how international standards helped create a unified worldwide system, how non-conforming releases broke the market for themselves, and how smart devices are improving reliability over raw speed. There is, of course, some head-splitting physics, but also a lot of common sense developments. So, let’s see how wireless went from the unreliable 1Mb/s slow lane to the madness of 9,600Mb/s throughput.
WIRELESS NETWORKS have been an utter revolution in the world of technology. Removing the need for any sort of wire to access networked data has propelled personal use, boosted businesses, and sparked entire new categories of devices, including the “Insecure-net Of Things”— we’re not apologizing for that, it’s true—among many others. Some of us old folk have been around long enough to remember coaxial-based 10Base2 networks, never mind the first 802.11 1Mb/s-capable efforts at wireless networking.
But what has changed to enable the same radio waves of the late ’90s, which struggled with 1Mb/s, to expand and be capable of shifting a claimed 9,600Mb/s in 2018? To answer that, we’re going to go back to basics, break out the slide rules, and knuckle down to some classic telecommunication theory.
So we know where we’re starting from, everything we’re talking about here is taking place on a standard sine wave. A wave has a frequency, which describes how many times it repeats each second; each full cycle starts at a zero point, goes up to one, back down through the zero point to minus one, and back up to the zero point (the time one cycle takes is the period, and the distance it covers is the wavelength). It has an amplitude, which describes how big the peak is, like a volume level. The final measure is its “phase,” which describes where in the cycle the wave is starting from: at the top of a wave, the bottom, the middle, or somewhere in between.
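Those three parameters are all there is to the wave itself. A minimal sketch (all values illustrative) showing how amplitude, frequency, and phase combine:

```python
import math

def wave_sample(t, amplitude=1.0, frequency=1.0, phase=0.0):
    """Sample a sine wave at time t (seconds).

    amplitude -- how big the peak is (the 'volume level')
    frequency -- cycles per second (Hz); the period is 1/frequency
    phase     -- starting offset in radians (0 = start at the zero point)
    """
    return amplitude * math.sin(2 * math.pi * frequency * t + phase)

# One full cycle of a 1Hz wave: zero, up to +1, back through zero to -1.
print(round(wave_sample(0.0), 3))   # 0.0  (zero crossing)
print(round(wave_sample(0.25), 3))  # 1.0  (peak)
print(round(wave_sample(0.75), 3))  # -1.0 (trough)
```

Shift `phase` by pi and the whole wave flips upside down, which is exactly the trick the modulation schemes below exploit.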
Wi-Fi itself operates in one of two frequency bands (see the boxout opposite for more details), but the short answer is 2.4GHz or 5GHz. These bands are then broken down into subfrequencies called channels. If you want to avoid interference from other nearby transmitters, you need to be on separate channels.
Depending on where in the world you are, the number of available channels varies. However, it tends to be that 2.4GHz has far fewer, up to a maximum of 14. The issue is that the way these are allocated means they overlap frequency bands, and the practical outcome is there are typically only four frequency-distinct channels—in the USA, it’s just three.
At 5GHz, the channel situation tends to be better, especially in the United States—there are nine open distinct channels right off the mark. Unlike at 2.4GHz, they are all frequency-distinct. There are a further 12 channels licensed under Dynamic Frequency Selection; basically, some radars (often used for weather stations) operate in the 5GHz band, and these get priority. When a router is powered, it scans these channels, and if it detects a radar operating, it won’t use that channel. If at any time during use it detects a radar, it ceases use of that channel within 10 seconds, and leaves it free for 30 minutes, before rescanning.
Either way, that’s a far higher number of potential channels for use (21), though things do get more complex when multiple channels are bonded together. We’ll go into channels in more depth when we cover MIMO, because they become important for multiple streams.
So, we’re getting there in terms of knowing some of the basic elements of a wireless signal. The next key element to clear out of the way is: How do the router and devices know when to transmit and receive data? We’re not going into depth on this, as it’s a real nuts-and-bolts component, but Wi-Fi uses a protocol called Carrier Sense Multiple Access with Collision Avoidance, or CSMA/CA. It’s a half-duplex system, so either end can only receive or transmit at any moment, not both. It’s a robust system, using acknowledge (ACK) and ready to send/clear to send (RTS/CTS) signals, with fixed data packets and mandated waits. The end result is that data can be sent and received in noisy environments, even with unconnected devices trying to send on the same band and the same channel. Devices can still connect, emails are still sent and received, and nobody dies, which is good.
We’re now at a point where we can ask: How is the data actually imprinted on the radio signal? It takes the form of a type of modulation—a system where the waveform is altered in a way so that the data can be retrieved at the receiving end.
Various types of modulation were used before the industry consolidated on QAM; the options all balance speed against robustness in noisy environments. Going back to ancient history, the original 802.11 and 802.11b used the very slow 11-chip Barker sequence to modulate the signal, while direct-sequence spread spectrum (DSSS) was used to multiplex the signal itself and avoid interference. This pads the original data with pseudo-random “noise” data, and makes for a very robust, if very slow (1 or 2Mb/s), signal.
One big advantage of the original 802.11a standard was that it used Quadrature Amplitude Modulation, or QAM. Hurrah, we know some of those words! QAM also made 802.11a more complex and expensive, which hampered implementation, delayed uptake, and enabled 802.11b to take a lead, but it was technically superior, like Betamax.
Above, you can see an image of a traditionally drawn sine wave, with some numbered dots on it. Alongside this is a “polar co-ordinate” chart (a circular plot, in other words) representation of the same sine wave. This plots a full cycle of the wavelength around the 360 degrees of a circle, and you can see where the path tracks through.
Let’s send some data. We’ll say that a waveform whose phase falls in the 0–180° half of the circle is a 0, while one shifted into the 180–360° half is a 1. We’ve just invented binary phase shift keying: shift the phase 180°, and you change the value. On a polar chart, you’ll see this gives a huge margin of error, making it very robust, and it’s the base modulation scheme for 802.11a/g/n/ac.
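That invented scheme maps directly to code. A minimal BPSK sketch (our own toy mapping, not the actual 802.11 symbol format):

```python
import math

def bpsk_modulate(bit):
    """Map one bit to a carrier phase: 0 -> 0 degrees, 1 -> 180 degrees."""
    return 0.0 if bit == 0 else math.pi

def bpsk_demodulate(phase):
    """Decide which half of the circle the received phase landed in.
    Each decision region is a full 180 degrees wide, which is the
    'huge margin of error' that makes BPSK so robust."""
    return 0 if math.cos(phase) > 0 else 1

# Even a badly distorted phase still decodes correctly:
sent = bpsk_modulate(1)              # 180 degrees
noisy = sent + math.radians(70)      # add 70 degrees of phase noise
print(bpsk_demodulate(noisy))        # 1
```

Phase noise of anything under 90° in either direction still lands in the right half of the circle, which is why BPSK is the fallback of last resort.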
The obvious next step is to ask: Why not encode a value for every 90 degrees of phase? This is quadrature phase shift keying, and it enables you to encode two bits, twice the amount of data, in the same amount of wavelength, but you halve the margin of error.
This is where QAM steps in. As you might have guessed, it modulates not just the phase, but also the amplitude of the wave. Combining phase and amplitude shifts gives 16 distinct constellation points, which enables us to encode four bits per symbol, but with the margin of error now a quarter of the size. This is called 16-QAM, and if you think that’s impressive, 802.11a supported 64-QAM back in 1999, and 802.11ac supports 256-QAM. At that level, though, the margin of error becomes smaller still. QAM is a sliding system: as noise in the signal increases, the router falls back from 256-QAM to 64-QAM, then 16-QAM, 4-QAM (QPSK), and eventually BPSK, reducing the data rate but increasing reliability as it does so.
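The trade-off between constellation density and noise margin can be put in numbers: each M-QAM symbol carries log2(M) bits, and the router steps down the ladder as the signal degrades. A sketch of that fallback (the SNR thresholds are illustrative, not taken from the 802.11 rate tables):

```python
import math

# Modulation ladder from the text: denser constellations carry more
# bits per symbol but need a cleaner signal (higher SNR) to decode.
LADDER = [
    ("256-QAM", 256, 31),   # 8 bits/symbol
    ("64-QAM",  64,  25),   # 6 bits/symbol
    ("16-QAM",  16,  18),   # 4 bits/symbol
    ("QPSK",    4,   11),   # 2 bits/symbol (4-QAM)
    ("BPSK",    2,   4),    # 1 bit/symbol
]

def pick_modulation(snr_db):
    """Slide down the ladder until the signal is clean enough."""
    for name, points, needed_snr_db in LADDER:
        if snr_db >= needed_snr_db:
            return name, int(math.log2(points))
    return None  # link too noisy even for BPSK

print(pick_modulation(33))  # ('256-QAM', 8)
print(pick_modulation(20))  # ('16-QAM', 4)
print(pick_modulation(6))   # ('BPSK', 1)
```

Note the shape of the curve: going from BPSK to 256-QAM only multiplies throughput by eight, while the noise tolerance shrinks dramatically, which is why a clean signal matters so much at the top of the ladder.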
IN BONDAGE
Earlier, we mentioned channels, which are frequency ranges within the 2.4GHz and 5GHz bands. Back in the days of 802.11g, a number of non-standard products appeared (an annoying trend that broke compatibility), such as rangeMAX and SUPER-G, which offered enhanced speeds but were incompatible with other “g” products. Some of these products used channel bonding, a transmission technique officially adopted in 802.11n, which doubles data rates by using the spectrum of two channels, going from 20MHz to 40MHz per channel: literally double the transmission speed.
The problem (especially with the non-standard gear) is that in the 2.4GHz band, the number of non-overlapping channels is limited to just three; outside of a residential setting, it’s almost certain that interference is going to be encountered, especially when you take into account Bluetooth, cordless DECT phones, and microwave interference, also in the 2.4GHz range.
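Why only three? The 2.4GHz channels sit just 5MHz apart, but each transmission is around 20MHz wide, so neighboring channels smear over each other. A quick check (using the standard channel center frequencies):

```python
def center_mhz(channel):
    """Center frequency of a 2.4GHz channel (channels 1-13, 5MHz spacing)."""
    return 2412 + 5 * (channel - 1)

def overlap(ch_a, ch_b, width_mhz=20):
    """Two 20MHz-wide signals overlap if their centers sit closer
    together than the channel width."""
    return abs(center_mhz(ch_a) - center_mhz(ch_b)) < width_mhz

# The classic US trio of 1/6/11 is the largest non-overlapping set:
print(overlap(1, 6))   # False (centers 25MHz apart)
print(overlap(1, 3))   # True  (centers only 10MHz apart)
print(overlap(6, 11))  # False
```

Bond two channels into a 40MHz block and the problem doubles, which is why 40MHz bonding at 2.4GHz is so antisocial in crowded areas.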
When 802.11ac was introduced, additional channel bonding options of 80MHz and 160MHz were added, but only for the 5GHz band. To be clear, the 802.11ac standard only works in the 5GHz band, and 2.4GHz capabilities are provided through the 802.11n specification.
BEAMING
So, we’ve improved the modulation, used wider channels, and packed more data into each cycle. What’s next? Sending more streams at the same time. The introduction of MIMO technology (Multiple Input Multiple Output, which is a fancy way of saying lots of antennas) was another case of the industry jumping the gun. Pre-N gear arrived packing MIMO, but it was never going to be fully compatible with the final ratified N gear, though Draft 2.0 gear was.
MIMO isn’t a particularly difficult idea to get your head around. Simply put, instead of sending a single stream of data, you bolt on another antenna and send two streams, or up to four with 802.11n. The 802.11ac standard bumped the maximum to eight. Be aware that not all antennas can necessarily be used: typically, the format “TX x RX = Max” is used to indicate “transmit antennas x receive antennas = maximum actual streams.”
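The “Max” in that notation follows from a simple rule: you can never use more spatial streams than the smaller antenna count on either end of the link, or than the radio itself supports. A hypothetical helper (the function name and default limit are ours):

```python
def max_streams(tx_antennas, rx_antennas, radio_limit=4):
    """Usable spatial streams are capped by the smaller antenna count
    at either end of the link, and by the radio's own ceiling
    (4 streams for 802.11n, 8 for 802.11ac)."""
    return min(tx_antennas, rx_antennas, radio_limit)

# A 4x4 router talking to a two-antenna laptop only gets two streams:
print(max_streams(4, 2))                 # 2
print(max_streams(8, 8, radio_limit=8))  # 8 (802.11ac's ceiling)
```

This is why a many-antenna router does nothing for a single phone with one or two antennas; the extra chains only pay off across multiple devices.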
A technology that was introduced in 802.11n, but has been better deployed in 802.11ac, is something called beamforming. This is a signal processing trick, which uses destructive and constructive interference to deliver optimized areas of signal just where your device happens to be. It’s basically tuning the waveforms to enhance, rather than cancel out, each other.
With 802.11n, this system didn’t work well across different manufacturers, largely because it had to fall back on “implicit mode,” where the router has to assume the transmit and receive paths behave identically, with no feedback from the device. While this can work, it’s rarely the case that the path is actually symmetrical.
Far better, and now improved with 802.11ac, is “explicit mode,” where feedback can be sent in the form of complex vector tables, which contain information on the power of each subcarrier. This can increase the signal by 2–3dB, basically doubling the signal strength, making the difference between being able to support 16-QAM or 256-QAM. Because feedback can arrive as quickly as every 10 to 25ms, this even works with moving devices.
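That “3dB, basically doubling” claim follows straight from the decibel definition: a power ratio is 10^(dB/10). A quick check:

```python
def db_to_power_ratio(db):
    """Convert a decibel gain to a linear power ratio: 10^(dB/10)."""
    return 10 ** (db / 10)

print(round(db_to_power_ratio(3), 2))   # 2.0  (3dB is roughly double power)
print(round(db_to_power_ratio(2), 2))   # 1.58
print(round(db_to_power_ratio(10), 2))  # 10.0
```

So a 2–3dB beamforming gain means 1.6 to 2 times the received power, which can be exactly the margin needed to hold a denser QAM constellation.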
Another thing to keep in mind, and another reason this system wasn’t optimized for 802.11n, is that it requires four antennas to offer both MIMO and beamforming to a single user. Beamforming is transmitting the same signal over two antennas to optimize signal strength, but it then requires two additional antennas to deliver MIMO capabilities, too. With 802.11ac being able to offer eight antennas, it’s capable of supporting four users with beamforming simultaneously, now called MultiUser MIMO, or MU-MIMO.
Some recent routers claim “triband” capabilities with two separate 5GHz bands. This non-standard 802.11 addition neatly enhances
routers to better support MU-MIMO, because they can simultaneously serve one device on the lower 5GHz channels and another device on the upper 5GHz channels.
THE AX MAN
That drops us neatly off at the future. The next iteration of Wi-Fi is upon us: 802.11ax has yet to be ratified, but Qualcomm has already released a number of 802.11ax-supporting chipsets based on draft standards. There were also various unreleased products being demonstrated at CES 2018, such as the D-Link DIR-X9000, a 4x4 MU-MIMO device due late in 2018, but the standard isn’t slated to be ratified until early 2019. So, should you care?
While 802.11ac squeezed more from the 5GHz band with MU-MIMO and greater channel bonding, 802.11ax is all about refinement, using the 5GHz bandwidth in smarter ways. While this is going to lead to faster connections, it’s largely aimed at supporting multiple users far more efficiently.
A key change is a switch from OFDM, a widely used type of multiplexing (used in Wi-Fi from 802.11a and g onward), to an even more efficient form, called OFDMA. We’ll avoid the complexities of this multiplexing (it’s the sort of thing that would take multiple classes at university to explain), but it closely overlays subcarriers orthogonally for efficient use of the available spectrum, and the key to OFDMA is that individual subcarriers can be assigned to individual devices. With OFDM, that is simply not possible. Plus, it frees up routers to switch to the best channels for specific devices. All of this cleverness, however, means OFDMA is much more costly to implement.
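The idea of handing groups of subcarriers to individual devices can be sketched as a simple scheduler. This is a toy model: the subcarrier counts loosely echo a 20MHz 802.11ax channel (nine 26-tone resource units), but the round-robin policy and names are ours:

```python
def assign_subcarriers(devices, total_subcarriers=234, group_size=26):
    """Toy OFDMA scheduler: carve the channel's subcarriers into
    fixed-size 'resource units' and deal them out round-robin.
    With plain OFDM, one device would occupy all of them at once."""
    groups = total_subcarriers // group_size
    allocation = {}
    for i in range(groups):
        device = devices[i % len(devices)]
        start = i * group_size
        allocation.setdefault(device, []).append((start, start + group_size - 1))
    return allocation

alloc = assign_subcarriers(["laptop", "phone", "tv"])
print(alloc["laptop"])  # [(0, 25), (78, 103), (156, 181)]
```

The win is that three devices with small amounts of traffic can all transmit in the same time slot, rather than queueing up to take over the whole channel one after another.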
QAM has been increased to 1,024-QAM. This, alongside the shift to OFDMA, should increase available throughput fourfold. It’s also going to enable MU-MIMO to work both in the downlink direction (from the router, as with 802.11ac) and on the uplink side. Other subtle improvements include a doubling of the available guard interval (to counter Doppler effects) for improved outdoor performance, plus double- and quad-length symbol durations for further efficiency.
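It’s worth noting that the QAM bump alone is modest; the arithmetic shows most of that fourfold figure has to come from OFDMA and smarter scheduling:

```python
import math

def bits_per_symbol(qam_points):
    """Each M-QAM constellation point encodes log2(M) bits."""
    return int(math.log2(qam_points))

old = bits_per_symbol(256)   # 8 bits/symbol (802.11ac)
new = bits_per_symbol(1024)  # 10 bits/symbol (802.11ax)
print(f"{new / old:.2f}x")   # 1.25x from denser QAM alone
```

A 25 percent gain per symbol, and one that only holds at very short range, where the signal is clean enough for 1,024-QAM’s tiny error margins.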
It looks like the first 802.11ax consumer routers will be pushing four antennas (so they can sell eight-antenna units the following year, we guess). Ignore the hyped 11,000Mb/s claims you’ll see; that’s using the old trick of adding the 2.4GHz and 5GHz bands together (some even throw in the 60GHz same-room network band). The truth is, until you have devices in your home that use 802.11ax, there’s no point rushing out to buy a router. History has shown that it pays to be patient with wireless technologies, but once settled, 802.11ax promises sounder, faster home networks.