Today, we ran a sequence of tests at increasing power levels on single frequencies, first in the 80M band and then in the 40M band. The general pattern is that a low-power (<= 100W) transmission causes a narrow (possibly single-tone) range of tones to be swapped out. After the swap (which leaves a narrow notch in the bitloading graph), another 30-second transmission at 100W was enough to cause CRC errors and loss of RG sync. After the RG comes back, the bitloading graph shows that a drastic change has occurred: the entire 80M band has been swapped out, and tones in the downstream 2 segment are allocated instead. Afterwards, even high-power operation may cause low to moderate FEC rates, but no further loss of sync is seen.
The same progression is seen with the 40M tests. Low power works fine and adapts "gently" by reassigning one or a small number of tone channels. High power is likely to cause loss of sync, followed by a large swath of the downstream 2 tones being reassigned to the downstream 1 range; the former 80M gap is reclaimed. After the 40M resync, the number of reserve bits drops to ~1,000 from ~2,400 (at RG reset), and "bits in use" rises to ~84.5% from ~69.9%. The max downstream rate decreases to 28,486 kb/s from 34,640 kb/s. (The "recommended profile" drops to 19/2 from 25/2.)
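To put the post-resync numbers above in perspective, here is a quick back-of-the-envelope comparison using the observed values (the dictionary keys are just illustrative labels, not anything read off the RG):

```python
# Line metrics observed at RG reset vs. after the 40M resync (from the notes above).
before = {"reserve_bits": 2400, "bits_in_use_pct": 69.9, "max_ds_kbps": 34640}
after  = {"reserve_bits": 1000, "bits_in_use_pct": 84.5, "max_ds_kbps": 28486}

# Fractional loss in max downstream rate.
rate_loss_pct = 100 * (before["max_ds_kbps"] - after["max_ds_kbps"]) / before["max_ds_kbps"]

# Fractional loss in reserve (headroom) bits.
reserve_loss_pct = 100 * (before["reserve_bits"] - after["reserve_bits"]) / before["reserve_bits"]

print(f"Max downstream rate loss: {rate_loss_pct:.1f}%")   # ~17.8%
print(f"Reserve bits loss:        {reserve_loss_pct:.1f}%")  # ~58.3%
```

So the defensive reallocation costs roughly a sixth of the attainable downstream rate and well over half of the reserve-bit headroom, which is consistent with the "recommended profile" dropping from 25/2 to 19/2.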