Mighty iPhone Power Ranges
Oh, those darned iPhones. Can't live with 'em, can't keep your job without 'em.
The vagaries of iPhones and other station devices are the most difficult part of managing a WiFi network, but there are some things that can be done on the infrastructure to try to make your stations work better. One of those things is lowering your AP transmit power to a level that more closely matches your client station's transmit power.
My main man G.T. Hill (of Ruckus Wireless) recently wrote a blog post discussing why this post is bullshit. Now I'm going to tell you why his blog post is bullshit. (sorry, G.T.)
G.T.'s primary point is that it is borderline mentally handicapped (politically correct term) to turn your AP's power down. His theory is that even if your client stations transmit at low power levels, a high AP power level at least allows the from-AP data rates to stay as high as possible. (G.T. goes on to add that most traffic is downstream, thus making it all the more important to maintain high from-AP rates. I have found this to be incorrect, so double-sorry, G.T.)
I wanted to check out an example using my iPhone (which, from what I gather, transmits at 10 dBm) connecting to an AP in a small office (which probably transmits at 17 dBm or 18 dBm). I ran a Skype test call to make sure that both the AP and the iPhone would transmit plenty of frames.
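To put that power gap in perspective, here's a quick back-of-the-envelope conversion (a sketch in Python; the 10 dBm and 17 dBm figures are the estimates above, not measured values):

```python
# Rough sketch: converting the quoted transmit powers from dBm to
# milliwatts to show how lopsided the link really is.

def dbm_to_mw(dbm):
    """Convert a power level in dBm to milliwatts."""
    return 10 ** (dbm / 10)

iphone_mw = dbm_to_mw(10)  # iPhone's estimated transmit power
ap_mw = dbm_to_mw(17)      # AP's estimated transmit power

print(f"iPhone: {iphone_mw:.0f} mW, AP: {ap_mw:.0f} mW")
print(f"The AP transmits at roughly {ap_mw / iphone_mw:.0f}x the iPhone's power")
```

A 7 dB gap is about a 5x difference in raw power (8 dB is closer to 6.3x), which is why the two directions of the same conversation can behave so differently.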
Just as G.T. hypothesized, the higher transmit power of the AP allows the AP to transmit at a high rate. The small office AP was acting as an 802.11g AP (it's probably an 802.11n AP, but 802.11n forbids the high-throughput rates when TKIP encryption is in use, so it falls back to 802.11g rates) and, sure enough, the small office AP used the highest possible rate (54 Mbps) for just about every transmitted frame:
Also as G.T. hypothesized, the rate for frames transmitted by the small office AP would likely have been lower if the AP's power level were reduced to match the iPhone's power level. We can deduce that from the fact that the iPhone was routinely transmitting frames at a lower rate (48 Mbps) than the AP:
As expected, G.T. is correct about the rates. If your primary goal is to have the highest possible rate for all frame transmissions on your WiFi network, chances are that setting your AP and station transmit power levels to the highest possible value will help you achieve that goal.
The problem with this whole discussion about rates is that having high rates really should not be your primary goal. High rates are attractive because it's fun to see a big number when you mouse over your system tray (or, for us Mac OS X users, when we hold the Option key and click the WiFi icon in the top menu bar), but for most enterprise WLANs, high rates fall somewhere around fifth on the list of things to look for when evaluating whether the WLAN sucks or not.
My criteria for not sucking:
- Drops/re-authentications are rare (meaning that every captive portal sucks)
- Roaming works for mobile devices
- Discovery traffic is minimal
- Retrys are low (indicating that whenever the wireless gets busy, it'll still probably work)
- Rates are high
(Please notice that throughput/goodput, packet sizes and single station usage are absent from this list. One could write an entire blog post about how much time is wasted analyzing those numbers.)
High percentages of Retrys are more damaging than low rates (in most cases) because Retrys waste channel time. There is a finite amount of time available for data to get across each channel. When a Retry happens, the time for the failed frame was wasted because all stations and APs must stay quiet during each frame transmission. Low rates for data frames also waste time, but for most enterprises the budget is there to install plenty of APs so that a decent signal can be had anywhere.
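To see why channel time is the scarce resource, here's a simplified airtime calculation (my own sketch; real 802.11 airtime also includes ACKs, interframe spacing and contention, which this ignores, and the preamble figure is a rough 802.11g approximation):

```python
# Simplified time-on-air for a single data frame, to show how much
# channel time low rates and retransmissions consume.

PREAMBLE_US = 20  # rough 802.11g (ERP-OFDM) preamble + PLCP header, microseconds

def airtime_us(payload_bytes, rate_mbps):
    """Approximate time on air for one data frame, in microseconds."""
    return PREAMBLE_US + (payload_bytes * 8) / rate_mbps

frame = 1500  # bytes, a typical full-size data frame
for rate in (54, 48, 24, 6):
    print(f"{rate:2d} Mbps: {airtime_us(frame, rate):7.1f} us")
```

A 1500-byte frame at 54 Mbps occupies the channel for roughly 240 microseconds; the same frame at 6 Mbps occupies it for roughly 2,000. A frame that fails and must be retried pays its airtime twice, and everyone on the channel waits both times.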
Now take a look at the Retry numbers for the communication between my 10 dBm iPhone and the 17/18 dBm office AP (while also taking a look at how much extra time it takes to do menial tasks when you go cheap and use Wireshark instead of WildPackets OmniPeek or Fluke AirMagnet WiFi Analyzer):
Both 9.4% and 16.7% are bad Retry numbers (though this was a small office, where bad numbers are expected because of neighboring WiFi networks), but 16.7% is a helluva lot worse. About 1.77 times worse.

And why is it worse? Most likely because the AP transmit power is set too high. The iPhone has no idea what the AP's transmit power level is, so when the iPhone receives a frame at a good signal strength, it assumes that a high rate can be used when it transmits. The iPhone doesn't realize that its transmitted frames go out about 7 or 8 dB lower than the AP's frames. The end result is the iPhone using rates that are too high (and if I were able to show you the entire capture of the iPhone's transmitted frames, you would see many failed attempts at 54 Mbps), causing lots of Retrys on the channel and thus SLOWING THE ENTIRE CHANNEL DOWN.
(I cannot make the point strongly enough that RETRYS SLOW THE ENTIRE CHANNEL DOWN. A massive Retry percentage like 16.7% makes my iPhone slower, every other device on my iPhone's network slower, every other device on any other network the AP offers slower and every device on any network using my channel slower. Retrys are bad, people. And the only way to get an accurate gauge of Retrys is by sniffing WiFi. It's why I cared enough about sniffing to start a free, advertising-bereft blog.)
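A crude way to quantify the damage (my own simplification, not something pulled from the capture): treat every frame marked Retry as a re-send whose airtime carries no new data, and assume retried frames are the same size as the originals.

```python
# Sketch: what fraction of data-frame airtime is pure repetition,
# assuming Retry frames are the same size as the originals.

def wasted_airtime_fraction(retry_pct):
    """Frames marked Retry are re-sends: their airtime carries no new data."""
    return retry_pct / 100

ap_retry = 9.4       # from-AP Retry percentage in the capture
iphone_retry = 16.7  # from-iPhone Retry percentage in the capture

print(f"AP wastes ~{wasted_airtime_fraction(ap_retry):.1%} of its airtime")
print(f"iPhone wastes ~{wasted_airtime_fraction(iphone_retry):.1%} of its airtime")
print(f"That's about {iphone_retry / ap_retry:.1f}x worse")
```

And remember that the wasted airtime is subtracted from the whole channel's budget, not just from the iPhone's.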
The bottom line here is that even though my friend G.T. has phenomenal taste in fast food, a lovely wife/kids/family and a solid history of only spending money on things that will help his career or increase in value (sorry, G.T., I couldn't resist), I believe he is wrong here. I have found that designing a WiFi network with AP power levels lowered to match station power levels works best, and my experience gathering statistics from real-world WiFi sniffing supports that.