The amount of clipping (in dB) has a direct effect on the signal-to-noise ratio and thereby on the bit error rate. Namely, it increases the bit error rate.
Clipping distorts the signal by cutting off part of the waveform. This distortion displaces the constellation points from their original positions. If a constellation point is displaced from its original position, where it could be decoded without error, into the decision region of an adjacent constellation point, a decoding error occurs. As the signal-to-noise ratio increases, the noise leaves more margin inside each decision region, so the constellation tolerates more clipping distortion. In this sense, clipping distortion acts like noise.
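To make the constellation-displacement argument above concrete, here is a minimal Python sketch under assumed parameters (a 64-subcarrier QPSK-OFDM signal, a simple envelope clipper, and an AWGN channel; the function name ofdm_ber_with_clipping and its arguments clip_db and snr_db are illustrative, not taken from the answer above). Lowering the clipping level forces more time-domain samples onto the clipping threshold, and after the receiver FFT this shows up as displaced constellation points and a higher measured BER.

```python
import numpy as np

rng = np.random.default_rng(0)

def ofdm_ber_with_clipping(clip_db, snr_db, n_carriers=64, n_blocks=2000):
    """Rough BER estimate for QPSK-OFDM with envelope clipping plus AWGN.
    Sketch only: parameter names and defaults are illustrative."""
    bits = rng.integers(0, 2, size=(n_blocks, n_carriers, 2))
    # Gray-mapped QPSK subcarrier symbols with unit average power.
    freq = ((2 * bits[..., 0] - 1) + 1j * (2 * bits[..., 1] - 1)) / np.sqrt(2)

    # IFFT to the time domain; 'ortho' keeps the average power at 1.
    time = np.fft.ifft(freq, axis=1, norm="ortho")

    # Clip the envelope clip_db above the RMS amplitude (soft envelope limiter).
    rms = np.sqrt(np.mean(np.abs(time) ** 2))
    clip_level = rms * 10 ** (clip_db / 20)
    mag = np.abs(time)
    scale = np.minimum(1.0, clip_level / np.maximum(mag, 1e-12))
    clipped = time * scale

    # AWGN channel at the requested per-sample SNR.
    noise_power = 10 ** (-snr_db / 10)
    noise = np.sqrt(noise_power / 2) * (rng.standard_normal(time.shape)
                                        + 1j * rng.standard_normal(time.shape))
    received = clipped + noise

    # FFT back to the subcarriers and make hard decisions on I and Q.
    freq_hat = np.fft.fft(received, axis=1, norm="ortho")
    bits_hat = np.stack((np.real(freq_hat) > 0, np.imag(freq_hat) > 0), axis=-1)
    return np.mean(bits_hat != bits)

# Harsher clipping (smaller clip_db) should raise the BER at a fixed SNR.
for clip_db in (1, 3, 6):
    print(f"clip level {clip_db} dB -> BER ~ {ofdm_ber_with_clipping(clip_db, 10.0):.4f}")
```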
Yes, clipping can affect the SNR performance, particularly for signals with a high peak-to-average power ratio (PAPR). This is mainly because a high PAPR increases the noise introduced at the amplifier, which in turn requires increasing the received power, and hence the SNR, to maintain the same error rate.
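As a rough illustration of why PAPR matters here, the short sketch below estimates the PAPR of one random QPSK-OFDM block (assuming 64 subcarriers; papr_db is a made-up helper, not a library function). The peaks it measures are the samples that a saturating amplifier or a deliberate clipper distorts.

```python
import numpy as np

rng = np.random.default_rng(1)

def papr_db(time_signal):
    """Peak-to-average power ratio of a complex baseband block, in dB."""
    power = np.abs(time_signal) ** 2
    return 10 * np.log10(power.max() / power.mean())

# One random QPSK-OFDM block with 64 subcarriers (illustrative parameters).
n_carriers = 64
symbols = (rng.choice([-1.0, 1.0], n_carriers)
           + 1j * rng.choice([-1.0, 1.0], n_carriers)) / np.sqrt(2)
block = np.fft.ifft(symbols, norm="ortho")

# A constant-envelope single-carrier signal would give 0 dB; a multicarrier
# block like this one routinely comes out several dB higher, and those peaks
# are what the amplifier (or a deliberate clipper) distorts.
print(f"PAPR of this block: {papr_db(block):.1f} dB")
```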
Clipping distortion has a noise-like influence. As the signal-to-noise ratio (SNR) increases, there is more margin inside each constellation point's decision region, and the constellation tolerates more clipping distortion. Thus, the amount of clipping in dB has a direct effect on the signal-to-noise ratio, and hence on the bit error rate (BER).