Looking at some datasheets, I see the Aruba APs' receiver sensitivity is pretty good. For example, the AP-315 is rated at -90 dBm (802.11n HT20 2.4 GHz MCS0/8 or 802.11ac VHT20 5 GHz MCS0). Although this seems good, when thinking about channel overlap or CCI it looks like a downside rather than an upside. Even with a good design and channel plan in 2.4 GHz, it is easy to hear a far overlapping AP with low transmit power at more than -90 dBm, causing CCI and degrading performance. So, how good is it to have such low numbers for receiver sensitivity? And how can we avoid channel overlap, taking this good receiver sensitivity into account?
The receiver performance numbers apply in a noise-free environment. Ultimately, what matters for the receiver is signal-to-noise ratio (SNR) rather than the absolute signal level.
In a 20 MHz channel, a sensitivity of -90 dBm translates to a need for about 11 dB of SNR (the thermal noise floor is around -101 dBm).
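Those two numbers can be checked with a short calculation. This is just a sketch of the standard thermal-noise formula (-174 dBm/Hz at room temperature plus 10·log10 of the bandwidth), using the 20 MHz channel and -90 dBm sensitivity figures from this thread:

```python
import math

def thermal_noise_floor_dbm(bandwidth_hz):
    """Thermal noise floor at room temperature: -174 dBm/Hz + 10*log10(BW)."""
    return -174 + 10 * math.log10(bandwidth_hz)

noise_floor = thermal_noise_floor_dbm(20e6)  # 20 MHz channel
sensitivity = -90                            # AP-315 MCS0 sensitivity spec
required_snr = sensitivity - noise_floor     # SNR implied by the spec

print(f"Noise floor: {noise_floor:.1f} dBm")              # -101.0 dBm
print(f"Implied SNR requirement: {required_snr:.1f} dB")  # 11.0 dB
```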
In your example, with CCI coming in at -90 dBm, we would need a signal level of -79 dBm or better to decode the data.
A less sensitive receiver would still be affected by the noise, and would need even more signal power to decode the data.
Hope that helps.
Thanks for your reply. That makes much more sense, and I get the general idea. But in the example where the AP has a sensitivity of -90 dBm, how do you go from -90 dBm of sensitivity to requiring -79 dBm of signal strength?
The receiver can decode a signal at -90 dBm reliably if there's no noise in the environment. That is how this is tested: in a clean RF environment, with only the -101 dBm thermal noise floor.
You suggested an environment where there's CCI noise coming in at -90dBm. In that case, the actual signal will need to be stronger to overcome the added noise.
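The arithmetic behind the -79 dBm figure can be sketched in a few lines. The rule of thumb is required signal ≈ noise level + required SNR; a slightly more careful version sums the thermal noise and the CCI as powers before adding the SNR. All values are the ones quoted in this thread; this is an illustration, not a planning tool:

```python
import math

def dbm_to_mw(dbm):
    """Convert a dBm level to linear milliwatts."""
    return 10 ** (dbm / 10)

def mw_to_dbm(mw):
    """Convert linear milliwatts back to dBm."""
    return 10 * math.log10(mw)

thermal_noise = -101  # dBm, 20 MHz channel
cci = -90             # dBm, co-channel interference from the far AP
required_snr = 11     # dB, implied by the -90 dBm sensitivity spec

# Rule of thumb: treat the dominant noise source as the floor.
print(cci + required_snr)  # -79 dBm, the figure quoted above

# Slightly more careful: sum thermal noise and CCI as powers.
effective_noise = mw_to_dbm(dbm_to_mw(thermal_noise) + dbm_to_mw(cci))
print(round(effective_noise + required_snr, 1))  # about -78.7 dBm
```

The two answers land within a fraction of a dB of each other because the -90 dBm CCI dominates the -101 dBm thermal floor, which is why the simple "noise + SNR" shortcut works here.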
© Copyright 2021 Hewlett Packard Enterprise Development LP. All Rights Reserved.