Hello,
I am having a problem with some RF handheld units in our warehouse. They are older units, and we've been experimenting with 1 Mbps and Auto bitrates (configured on the device). A fixed bitrate of 1 Mbps seems to give us better reliability in terms of less packet loss; however, when the bitrate is set to 1 Mbps, the device fails to roam to another access point.
We can successfully associate and connect to the AP in our office, but once we go out into the warehouse it isn't long before we're out of range and disconnect completely. Changing the device's bitrate to 'auto' immediately reconnects it and allows it to roam freely. When set to auto, the unit operates at 5.5 or 11 Mbps.
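For context, the bitrate is a client-side setting on the handhelds. On a Linux-based unit with the classic wireless-tools, the toggle we're flipping would look roughly like this (the interface name wlan0 and the tooling itself are assumptions about our particular devices, not anything on the Aruba side):

```shell
# Pin the radio to a fixed 1 Mbps rate (the setting that breaks roaming for us)
iwconfig wlan0 rate 1M

# Restore automatic rate selection (the setting that roams fine)
iwconfig wlan0 rate auto

# Check the currently negotiated bit rate
iwconfig wlan0 | grep "Bit Rate"
```

The equivalent on other handheld firmware is usually buried in a vendor radio-settings menu rather than exposed as a command.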
The only difference is that the AP in the office is an AP-135 and the APs in the warehouse are AP-205s; the profiles are the same on both. The ARM settings and SSID bitrate settings are all defaults. This is a controller-based setup.
I'm not a wireless expert by any means, so I'll ask: can bitrate affect a device's ability to roam and associate? And why does a lower bitrate result in less packet loss?
Thank you.