Ten rules for placing your Wi-Fi access points

Enlarge / The top floor of our test house is relatively straightforward, although like many houses, it suffers from terrible router placement nowhere near its center.
Jim Salter

Here at Ars, we’ve spent a lot of time covering how Wi-Fi works, which kits perform the best, and how upcoming standards will affect you. Today, we’re going to go a little more basic: we’re going to teach you how to figure out how many Wi-Fi access points (APs) you need, and where to put them.

These rules apply whether we’re talking about a single Wi-Fi router, a mesh kit like Eero, Plume, or Orbi, or a set of wire-backhauled access points like Ubiquiti’s UAP-AC line or TP-Link’s EAPs. Unfortunately, these “rules” are necessarily closer to “guidelines,” since there are a lot of variables it’s impossible to fully account for from an armchair a few thousand miles away. But if you become familiar with these rules, you should at least walk away with a better practical understanding of what to expect (and not expect) from your Wi-Fi gear and how to get the most out of it.

Before we get started

Let’s go over one bit of radio-frequency (RF) theory before we get started on our ten rules; some of them will make much better sense if you understand how RF signal strength is measured and how it attenuates over distance and through obstacles.

Enlarge / Note: some RF engineers recommend -65dBm as the lowest signal level for maximum performance.
Jim Salter

The above graph gives us some simple free-space loss curves for Wi-Fi frequencies. The most important thing to understand here is what the units actually mean: dBm converts directly to milliwatts, but on a logarithmic base-ten scale. For each 10dBm drop, the actual signal strength in milliwatts drops by a factor of ten. -10dBm is 0.1mW, -20dBm is 0.01mW, and so forth.
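The dBm-to-milliwatt relationship is a one-line formula, sketched here in Python (the function names are mine, for illustration; they aren't from any Wi-Fi tool):

```python
import math

def dbm_to_mw(dbm: float) -> float:
    """Convert a dBm signal level to absolute power in milliwatts."""
    return 10 ** (dbm / 10)

def mw_to_dbm(mw: float) -> float:
    """Convert milliwatts back to dBm (logarithmic, base ten)."""
    return 10 * math.log10(mw)

print(dbm_to_mw(-10))  # 0.1 (mW)
print(dbm_to_mw(-20))  # 0.01 (mW)
```

Note that every 10dBm step really is a factor of ten in raw power, which is why the logarithmic scale is so convenient for talking about attenuation.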

The logarithmic scale makes it possible to measure signal loss additively, rather than multiplicatively. Each doubling of distance drops the signal by 6dB, as we can clearly see when we look at the bold red 2.4GHz curve: at 1m distance, the signal is -40dBm; at 2m, it’s -46dBm; and at 4m, it’s down to -52dBm.

Walls and other obstructions (including but not limited to human bodies, cabinets and furniture, and appliances) will attenuate the signal further. A good rule of thumb is -3dB for each additional wall or other substantial obstruction, which we’ll talk more about later. You can see additional curves plotted above in finer lines for the same distances, including one or two interior walls (or other obstacles).
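Putting distance and wall losses together gives a simple back-of-the-envelope estimator. This is a sketch of the rule of thumb, not a calibrated propagation model: it assumes, as the bold 2.4GHz curve above does, roughly -40dBm at one meter, free-space loss of 20·log10(distance) (about 6dB per doubling), and -3dB per wall:

```python
import math

def estimated_signal_dbm(distance_m: float, walls: int = 0,
                         ref_dbm_at_1m: float = -40.0) -> float:
    """Rough received-signal estimate: free-space loss grows by
    20*log10(distance), i.e. ~6dB per doubling of distance,
    plus an assumed 3dB per interior wall."""
    return ref_dbm_at_1m - 20 * math.log10(distance_m) - 3 * walls

# Reproduces the bold 2.4GHz curve from the graph:
print(round(estimated_signal_dbm(1)))  # -40
print(round(estimated_signal_dbm(2)))  # -46
print(round(estimated_signal_dbm(4)))  # -52
```

Adding `walls=1` or `walls=2` shifts the whole curve down by 3dB or 6dB, matching the finer lines in the plot.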

While you should ideally have signal levels no lower than -67dBm, you shouldn’t fret about trying to get them much higher than that; typically, there’s no real performance difference between a blazing-hot -40dBm and a considerably cooler -65dBm, as far apart as they may look on a plot. There’s a lot more going on with Wi-Fi than just raw signal strength; as long as you exceed that minimum, it doesn’t really matter how much you exceed it by.

In fact, too hot a signal can be as much of a problem as too cold; many a forum user has complained for pages about low speed test results, until finally some wise head asks “did you put your device right next to the access point? Move it a meter or two away, and try again.” Sure enough, the “problem” resolves itself.

Rule 1: No more than two rooms and two walls

Our first rule for access point placement is no more than two rooms and two interior walls between access points and devices, if possible. This is an incredibly fudge-y rule, because different rooms are shaped and sized differently, and different houses have different wall structures; but it’s a good starting point, and it will serve you well in typically sized houses and apartments with typical, reasonably modern sheetrock interior wall construction.

“Typically sized,” at least in much of the USA, means bedrooms about three or four meters per side and larger living areas up to five or six meters per side. If we take nine meters as the average linear distance covering “two rooms” in a straight line, and add in two interior walls at -3dB apiece, our RF loss curve shows us that 2.4GHz signals are doing fine at -65dBm. 5GHz, not so much: if we need a full nine meters and two full walls, we’re down to -72dBm at 5GHz. This is certainly enough to get a connection, but it’s not great. In real life, a device at -72dBm on 5GHz will likely see nearly the same raw throughput as one at -65dBm on 2.4GHz, but the technically slower 2.4GHz connection will tend to be more reliable and exhibit consistently lower latency.
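Rule 1’s numbers can be sanity-checked with the standard free-space path-loss formula, FSPL(dB) = 20·log10(d_m) + 20·log10(f_MHz) - 27.55, which puts a 0dBm transmitter at roughly -40dBm (2.4GHz) or -46dBm (5GHz) at one meter. The anchor levels here are derived from that formula to match the curves discussed earlier, not measured figures:

```python
import math

def one_meter_ref_dbm(freq_mhz: float) -> float:
    """Free-space level at 1m for an assumed 0dBm transmitter:
    FSPL(dB) = 20*log10(d_m) + 20*log10(f_MHz) - 27.55, with d_m = 1."""
    return -(20 * math.log10(freq_mhz) - 27.55)

def signal_dbm(freq_mhz: float, distance_m: float, walls: int) -> float:
    """One-meter reference minus distance loss minus 3dB per wall."""
    return (one_meter_ref_dbm(freq_mhz)
            - 20 * math.log10(distance_m)
            - 3 * walls)

# Nine meters and two interior walls, as in Rule 1:
print(f"2.4GHz: {signal_dbm(2400, 9, 2):.1f} dBm")  # about -65
print(f"5GHz:   {signal_dbm(5000, 9, 2):.1f} dBm")  # about -72
```

The roughly 6dB gap between the bands at any given distance is exactly the rural-user experience described below: 2.4GHz keeps working where 5GHz has already dropped below the comfortable threshold.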

Of course, this all assumes that distance and attenuation are the only problems we face. Rural users (and suburban users with large yards) have likely already noticed this difference and internalized the rule of thumb “2.4GHz is better, but man, 5GHz sucks.” Urban users, or suburban folks in housing developments with postage-stamp yards, tend to have a different experience entirely, which we’ll cover in Rule 2.

Listing image by Jim Salter
