"..the presence of a legacy 802.11b participant will significantly reduce the speed of the overall 802.11g network. Some 802.11g routers employ a back-compatible mode for 802.11b clients called 54g LRS (Limited Rate Support)."
It is normal for each new communication standard to provide backward compatibility: the new device also adopts the old standard so that legacy devices can still join the network.
Neither 802.11g nor 802.11n performs well with legacy systems such as 802.11b, because 802.11b nodes (slow clients) occupy the wireless channel for a longer period per packet and consequently slow down the operation of the whole network.
Actually, the answer is slightly more nuanced. The 802.11g standard extends the set of transmission rates of 802.11b with new modulation schemes that allow for higher bit rates (provided, of course, that the signal-to-noise ratio at the receiver is good enough). Now, in a network with only 802.11g terminals, everybody can understand everything, so each terminal (and the AP) can transmit using any of the available rates. The presence of an 802.11b station, instead, has two effects.
1) For backward compatibility, all packets that are sent to or transmitted by 802.11b terminals need to use a modulation that those terminals can decode, so the new modulations introduced by 802.11g cannot be used for them. However, 802.11g terminals can still use the new modulations when communicating among themselves. Broadcast packets, of course, need to be transmitted at a rate that every terminal in the network can decode. In this respect, the presence of a few 802.11b terminals does NOT have much impact on the transmit rate of the other terminals.
2) The second effect of 802.11b STAs in an 802.11g cell is that, by transmitting at lower rates, 802.11b stations occupy the channel for longer than a station that, in the same conditions, could use a higher transmit rate. Since only one terminal at a time can transmit over the channel, the presence of slower terminals affects the long-term throughput of the faster ones, which have to wait longer to get access to the channel. This throughput loss, however, is significant only when the traffic generated by the 802.11b nodes is very high, i.e., when they access the channel so often that it is almost "saturated". In normal conditions, this extra delay is basically negligible.
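A minimal sketch (my own illustration, not from any standard document) of the airtime effect described in point 2: with CSMA/CA each station wins the channel roughly equally often, so a slow 802.11b sender holds the medium longer per packet and drags down everyone's long-run throughput. PHY/MAC overheads (preambles, ACKs, backoff) are ignored for clarity.

```python
PACKET_BITS = 1500 * 8  # one 1500-byte frame

def cell_throughput_per_station(rates_mbps):
    """Per-station throughput (Mb/s) when stations alternate packets
    fairly: bits one station sends per 'round', divided by the total
    airtime of that round (sum of each station's transmission time)."""
    airtime_us = sum(PACKET_BITS / r for r in rates_mbps)  # us per round
    return PACKET_BITS / airtime_us                         # Mb/s each

# All-802.11g cell: three stations at 54 Mb/s
print(cell_throughput_per_station([54, 54, 54]))   # 18 Mb/s each

# Mixed cell: one 802.11b station at 11 Mb/s joins two g stations
print(cell_throughput_per_station([54, 54, 11]))   # ~7.8 Mb/s each
```

Note how a single saturated 11 Mb/s sender more than halves the throughput of the 54 Mb/s stations, which matches the observation that the loss only matters when the slow station transmits heavily.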
You can take a look at this webpage to learn a little bit more about these aspects:
Thanks everyone for your explanations of why 802.11b can slow down the operation of 802.11g. This has been studied very well, and you can find the answer in many networking textbooks as well as white papers.
The maximum speed of the b release is 11 Mb/s (1, 2, 5.5, or 11 Mb/s depending on the spreading code and modulation type), while the g release reaches 54 Mb/s using OFDM.
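These rates translate directly into channel occupancy. A rough back-of-the-envelope comparison (payload bits only, ignoring preambles, headers and ACKs) shows why an 11 Mb/s 802.11b frame holds the channel about five times longer than the same frame at 54 Mb/s:

```python
payload_bits = 1500 * 8  # one 1500-byte frame

# 802.11b rates (1, 2, 5.5, 11 Mb/s) plus the top 802.11g OFDM rate
for rate in (1, 2, 5.5, 11, 54):
    airtime_us = payload_bits / rate  # microseconds on air
    print(f"{rate:>4} Mb/s -> {airtime_us:7.1f} us on air")
```

At 1 Mb/s the same frame occupies the channel for 12 ms, versus about 0.22 ms at 54 Mb/s, which is the root of the slowdown discussed above.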