To answer this, we first have to understand the concept of integrated noise in the band.
To determine thermal noise power we use the expression kTBF, where
k is Boltzmann's constant (1.38 × 10⁻²³ J/K)
T is temperature in Kelvin (usually assumed to be 290 K, about 17 °C)
B is the channel bandwidth in Hz
F is the noise factor (in dB it is called the noise figure)
To put this in dBm we take 10*log10(kTBF / 1 mW), or equivalently 10*log10(kTB / 1 mW) + F (in dB). At 290 K this works out to -174 dBm/Hz + 10*log10(B) + F.
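As a quick sanity check, the calculation above can be sketched in Python (the function name and the 6 dB noise figure are my own illustrative choices, not from the original question):

```python
import math

# Boltzmann's constant in J/K
K_BOLTZMANN = 1.380649e-23

def thermal_noise_dbm(bandwidth_hz, noise_figure_db, temp_k=290.0):
    """Integrated thermal noise power: 10*log10(kTB / 1 mW) + F."""
    ktb_watts = K_BOLTZMANN * temp_k * bandwidth_hz
    return 10 * math.log10(ktb_watts / 1e-3) + noise_figure_db

# Example: a 20 MHz channel with a 6 dB noise figure
print(round(thermal_noise_dbm(20e6, 6.0), 1))  # -> -95.0 dBm
```

This matches the -174 dBm/Hz shortcut: -174 + 10*log10(20e6) + 6 ≈ -95 dBm.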
Since we are only varying the bandwidth, we can see that doubling the bandwidth increases the thermal noise in the receiver by 10*log10(2) ≈ 3 dB.
So going from 20 to 40 MHz is +3 dB of noise;
from 20 to 80 MHz is +6 dB.
If 20 MHz is 100% of the range (assuming free-space propagation, where path loss grows as 20*log10(distance), so each 3 dB of SNR loss costs a factor of √2 in distance):
40 MHz is 71% of the range
80 MHz is 50% of the range
160 MHz is more interesting
In 160 MHz the radio is actually split into 80 + 80 MHz, so each 80 MHz block has the same integrated noise as a plain 80 MHz channel. HOWEVER, in 160 MHz the antennas are divided across the two blocks, which reduces the MRC (Maximal Ratio Combining) benefit by 10*log10(2), or 3 dB more (i.e. an 80 MHz 4x4 radio becomes an 80+80 2x2 radio).
So relative to 20 MHz, 160 MHz is 9 dB worse, or 35% of the range of the 20 MHz channel with the full antenna set.
So for a system running MIMO at the same MCS rate:
20 MHz 100%
40 MHz 71%
80 MHz 50%
160 MHz 35%
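The percentages above can be reproduced under the same free-space assumption, where a ΔdB rise in the noise floor shrinks range by 10^(-Δ/20) (the function name below is my own):

```python
def range_fraction(noise_penalty_db):
    """Free-space path loss grows as 20*log10(d), so an extra
    noise_penalty_db on the noise floor scales range by 10**(-penalty/20)."""
    return 10 ** (-noise_penalty_db / 20)

# Noise penalties relative to 20 MHz: 0, 3, 6, and 9 dB (9 = 6 dB of
# bandwidth plus 3 dB of lost MRC gain in 80+80 operation)
for bw, penalty_db in [("20 MHz", 0), ("40 MHz", 3), ("80 MHz", 6), ("160 MHz", 9)]:
    print(bw, f"{range_fraction(penalty_db):.0%}")  # 100%, 71%, 50%, 35%
```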
Hope this makes sense and answers your questions.