Wi-Fi for BYOD: AP-135 Testing Shows iPads Stay Connected Over 120 Feet


At Aruba, the Technical Marketing team spends a lot of time thinking about performance. One of the key aspects of performance characterization is “rate vs. range” testing. There are a few testing tools and methodologies out there, so we wanted to take some time to explain how we typically go about this.

 

The first piece to figure out is what types of APs and clients you have in your environment, since they need to be characterized together as a system. We have a mix of client types, including legacy 802.11a/b/g devices, 802.11n MIMO devices, Broadcom-based MacBook Pros with 3 transmit chains, 3 receive chains, and 3 spatial streams (3x3:3) for up to a 450 Mbps data rate, Intel 6300-based laptops (3x3:3), iPads (1x1:1), and many others in between. The client driver and OS can also affect performance, so they should be updated where possible and noted.
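
As a quick reference point, the 450 Mbps peak comes straight out of the 802.11n rate arithmetic for three spatial streams on a 40 MHz channel with a short guard interval (MCS 23). A minimal sketch of that calculation, using only the standard 802.11n parameters:

# 802.11n peak PHY rate for MCS 23 (64-QAM, rate 5/6, 3 spatial streams,
# 40 MHz channel, short guard interval).
DATA_SUBCARRIERS_40MHZ = 108    # data subcarriers in a 40 MHz HT channel
BITS_PER_SUBCARRIER_64QAM = 6   # 64-QAM carries 6 coded bits per subcarrier
CODING_RATE = 5 / 6             # rate-5/6 coding at MCS 23
SYMBOL_TIME_SGI = 3.6e-6        # OFDM symbol time with short guard interval (seconds)
SPATIAL_STREAMS = 3

rate_per_stream = (DATA_SUBCARRIERS_40MHZ * BITS_PER_SUBCARRIER_64QAM
                   * CODING_RATE / SYMBOL_TIME_SGI)    # 150 Mbps per stream
peak_rate = SPATIAL_STREAMS * rate_per_stream          # 450 Mbps for 3x3:3

print(f"Per-stream rate: {rate_per_stream / 1e6:.0f} Mbps")
print(f"3x3:3 peak rate: {peak_rate / 1e6:.0f} Mbps")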

 

The AP-135 is built on our latest-generation 802.11n silicon and supports 3x3:3 on the WLAN infrastructure side. The highest performance is achieved when associating 3x3:3 clients to our 3x3:3 AP-135s. We want to look at performance both close to the AP and at varying distances, for both line-of-sight and non-line-of-sight locations.

 

For repeatability and reproducibility, we set the transmit power to maximum, select a particular channel of operation, turn on any performance optimizations on the controller, and test in a clean RF environment. We also use a specific client (a MacBook Pro or iPad, for example) running a particular software version for all test runs. Lastly, we make sure the testing locations are well defined and the client orientation is constant across test runs.
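
One simple way to keep those variables pinned down is to record them explicitly for every measurement point. The sketch below is purely illustrative; the field names and values are hypothetical and not taken from our actual test setup:

from dataclasses import dataclass

@dataclass(frozen=True)
class TestPoint:
    """One rate-vs-range measurement point with every controlled variable recorded."""
    client: str          # e.g. "MacBook Pro (Broadcom 3x3:3)" or "iPad (1x1:1)"
    client_sw: str       # driver / OS version used for every run
    location: str        # well-defined spot, e.g. "70 ft, NLOS, multiple walls"
    orientation: str     # keep constant across runs
    channel: int         # fixed channel of operation
    tx_power_dbm: int    # AP transmit power pinned to maximum
    runs: int = 3        # repeat and average to even out small variations

# Hypothetical entries for two of the locations discussed below.
plan = [
    TestPoint("iPad (1x1:1)", "recorded per run", "15 ft, LOS", "flat on table", 149, 23),
    TestPoint("iPad (1x1:1)", "recorded per run", "120 ft, NLOS", "flat on table", 149, 23),
]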

 

For testing tools, we have used Ixia Chariot (a generic Dell server as the wired endpoint, Ixia wireless endpoint 7.10 SP3 on the client) in the past with good results. We run both TCP and UDP throughput scripts, leveraging the same scripts used for performance tests (usually four or more endpoint pairs, or traffic flows, to ensure the channel is fully utilized), and we run each test multiple times and average the results across the runs to even out small variations in the data.
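
The same basic pattern — several parallel flows, multiple runs, results averaged — can be reproduced with any traffic generator. Below is a minimal illustration using iperf3 rather than Chariot; the server address, flow count, and run count are placeholders:

import json
import statistics
import subprocess

SERVER = "192.0.2.10"   # placeholder wired endpoint running "iperf3 -s"
FLOWS = 4               # 4+ parallel flows to keep the channel fully utilized
RUNS = 3                # repeat and average to even out small variations
DURATION = 30           # seconds per run

def run_tcp_downstream() -> float:
    """Run one multi-flow downstream TCP test and return aggregate throughput in Mbps."""
    out = subprocess.run(
        ["iperf3", "-c", SERVER, "-P", str(FLOWS), "-t", str(DURATION), "-R", "-J"],
        capture_output=True, text=True, check=True,
    )
    result = json.loads(out.stdout)
    return result["end"]["sum_received"]["bits_per_second"] / 1e6

throughputs = [run_tcp_downstream() for _ in range(RUNS)]
print(f"Average downstream TCP throughput over {RUNS} runs: "
      f"{statistics.mean(throughputs):.1f} Mbps")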

 

More recently, we have also used the Veriwave Wave Deploy Pro for rate vs. range testing (EF1101 hardware and WaveAgent 3.50.0_2011.10.29.05 on the client). The benefits of Wave Deploy Pro include the ability to interface with the client directly and obtain the associated PHY data rates as seen by the client at each testing location. The tool also provides throughput testing with different traffic mixes and simulated applications, where one can set an SLA that must be met for a successful test run (a 4.0 MOS for voice, for example).
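
That kind of per-location SLA check is straightforward to apply to any result set. The numbers and data layout below are purely illustrative (they are not Wave Deploy Pro output), but they show the pass/fail logic:

# Measured voice MOS per location (values here are illustrative only).
voice_mos = {"15 ft": 4.3, "30 ft": 4.2, "70 ft": 4.1, "120 ft": 3.8}

MOS_SLA = 4.0   # SLA that must be met for a successful voice test run

for location, mos in voice_mos.items():
    status = "PASS" if mos >= MOS_SLA else "FAIL"
    print(f"{location}: MOS {mos:.1f} -> {status}")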

 

So what were some of our test results? We looked at the associated PHY rate using the Intel 6300 (3x3:3) Lenovo laptop and Wave Deploy Pro, and found that it maintained an average of 450 Mbps at 15’, 320 Mbps at 30’, 240 Mbps at 70’, and 65 Mbps at 120’. Only the 15’ location was line of sight; all other locations had multiple walls and obstructions. The PHY rates translated to over 200 Mbps of downstream throughput at 15’, roughly 125 Mbps at 30’, 80 Mbps at 70’, and over 50 Mbps at 120’. The upstream TCP numbers were similar, but a little lower than downstream.
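
As a quick sanity check on those numbers, it helps to look at throughput as a fraction of the associated PHY rate at each location. The back-of-the-envelope ratios below use the approximate figures quoted above:

# Approximate Intel 6300 figures from above: (average PHY rate, downstream throughput) in Mbps.
results = {"15 ft": (450, 200), "30 ft": (320, 125), "70 ft": (240, 80), "120 ft": (65, 50)}

for location, (phy_mbps, tput_mbps) in results.items():
    # Goodput as a fraction of the associated PHY rate (MAC/protocol overhead, retries, etc.).
    print(f"{location}: {tput_mbps}/{phy_mbps} Mbps = {tput_mbps / phy_mbps:.0%} of the PHY rate")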

 

Our MacBook Pro (Broadcom 3x3:3) client performed a little better than the Intel client up to 70’, achieving over 250 Mbps of downstream throughput at 15’, over 200 Mbps at 30’, over 100 Mbps at 70’, and over 30 Mbps at 120’. These numbers were similar for both downstream and upstream tests.

 

We found that the iPad (1x1:1), while supporting lower MCS data rates, maintained a very strong associated rate and obtained over 20 Mbps of TCP throughput all the way out to 120’. The UDP numbers were over 50 Mbps at 15’, over 30 Mbps out to 70’, and still over 25 Mbps at 120’.

 

Clearly, these are the maximum performance numbers you can expect for each client type, given the clean RF testing environment (our Proof of Concept Lab in Sunnyvale), the use of high-performing clients, the AP set to maximum transmit power, and so on. At higher AP densities, and when using ARM to set channel and power dynamically, expect lower transmit power to limit each AP’s range. But this also means more APs and radios in close proximity for clients to associate with, which in turn supports the higher client densities demanded in today’s BYOD enterprise environments.

 

Let us know if you have any thoughts or suggestions for future testing work. We hope you enjoyed this post.

Sr. Director Business Operations, Aruba Networks