I'm not sure I understand the intent of your question. Are you asking how to make a Hall effect sensor (not a Hall sensor, which presumably detects someone walking down a hall? =) more sensitive for an EMC testing application, or are you asking how to make a Hall effect sensor less susceptible to EM interference?
I'd say it really depends on where the interference is getting into the system and what the target application is. Since a Hall effect sensor detects a magnetic field, anything that shields it from magnetic fields will obviously affect its reading. For static magnetic field detection, a non-ferrous electric shield around the sensor would still let static magnetic fields pass through, so the sensor would work more or less as expected. However, rapidly changing magnetic fields would be at least partially attenuated by the eddy currents induced in the shielding conductor. Shielding the leads from the sensor to the detection circuitry could help, but may introduce other problems if the shield isn't properly terminated; a poorly handled shield can be worse than none at all. Without a shield, appropriately twisted pairs would help minimize coupling to outside signals.
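To put a rough number on that eddy-current shielding effect, you can look at skin depth: a non-ferrous shield passes fields that vary slowly compared to its thickness in skin depths, and attenuates faster ones. Here's a back-of-the-envelope sketch in Python (the copper resistivity is the standard textbook figure; the frequencies are just examples, not from your application):

```python
import math

def skin_depth(freq_hz, resistivity_ohm_m, mu_r=1.0):
    """Skin depth in metres: delta = sqrt(2*rho / (omega * mu))."""
    mu0 = 4e-7 * math.pi
    omega = 2 * math.pi * freq_hz
    return math.sqrt(2 * resistivity_ohm_m / (omega * mu_r * mu0))

# Copper (non-ferrous): rho ~= 1.68e-8 ohm*m, mu_r ~= 1
rho_cu = 1.68e-8
for f in (50, 1e3, 1e6, 100e6):
    d = skin_depth(f, rho_cu)
    print(f"{f:>12.0f} Hz -> skin depth {d*1e3:.4f} mm")
```

A 1 mm copper shield is essentially transparent at 50 Hz (skin depth ~9 mm) but is over a hundred skin depths thick at 100 MHz, which matches the intuition above: static and slow fields get through, fast-changing ones don't.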
Twisting all three leads together would have a similar effect, but if we assume the supply voltage is relatively insensitive to noise, then the critical leads are the signal and its reference (presumably ground) from the sensor. The same amount of differential noise applied between the signal lead and ground is much more detrimental than it would be on the supply leads, because the voltage you're detecting and amplifying is generally much smaller than the voltage applied across the sensor.
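A quick illustration of that last point, with made-up but plausible numbers (a 5 V supply and a 100 mV sensor output are assumptions for the example, not values from your setup):

```python
# The same 10 mV of coupled differential noise hurts a small signal
# far more than it hurts the supply rail.
v_supply = 5.0      # volts across the sensor
v_signal = 0.100    # volts of useful sensor output
v_noise  = 0.010    # volts of coupled differential noise

print(f"Noise on supply: {100 * v_noise / v_supply:.1f}% of the rail")    # 0.2%
print(f"Noise on signal: {100 * v_noise / v_signal:.1f}% of the signal")  # 10.0%
```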
At any rate, what twisting wires does is reduce the net loop area between them relative to an applied EM field, thereby reducing the apparent inductance and the overall coupling. Put another way, if you imagine a static magnetic field applied across the gap between two parallel wires, twisting them means the second half of each twist presents an oppositely oriented loop to the field relative to the first half, so the induced voltages cancel. EM waves are of course not static, so the quality of this cancellation depends on the number of twists per wavelength: the twists must be short enough that the local field is still essentially uniform across each twist, so the two halves can cancel. That's part of why, in network cabling, a higher-grade cable (e.g. CAT5 vs. CAT3) uses more twists per foot.
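The loop-area argument follows directly from Faraday's law: for a sinusoidal field the peak induced EMF is V = ω·B·A, so shrinking the effective area A shrinks the pickup proportionally. A minimal sketch, where the geometry and the 99% cancellation figure for the twisted pair are assumptions for illustration only:

```python
import math

def induced_emf_peak(freq_hz, b_peak_tesla, loop_area_m2):
    """Peak EMF from Faraday's law for a sinusoidal field: V = omega * B * A."""
    return 2 * math.pi * freq_hz * b_peak_tesla * loop_area_m2

# Assumed, illustrative geometry: 1 m of cable, 2 mm wire spacing.
f = 1e6                               # 1 MHz interfering field
b = 1e-6                              # 1 uT peak flux density
area_parallel = 1.0 * 2e-3            # full 1 m x 2 mm loop
area_twisted  = area_parallel * 0.01  # ~99% cancellation from twisting (assumed)

print(f"parallel pair: {induced_emf_peak(f, b, area_parallel)*1e3:.2f} mV")
print(f"twisted pair:  {induced_emf_peak(f, b, area_twisted)*1e3:.2f} mV")
```

Note the wavelength caveat: at 100 MHz the free-space wavelength is about 3 m, so a typical twist pitch of a few centimetres still gives many twists per wavelength, but at those frequencies common-mode paths and parasitics tend to dominate over the simple differential loop pickup modelled here.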
Actually, I'm under the impression that twisting helps mainly at relatively low frequencies; my problems are with susceptibility to signals at hundreds of MHz. But anyway, I will follow your kind advice.
In one application I used a Hall effect sensor to sense high DC currents (about 100 A). Specifically, it was a LEM DHAB family sensor.
During EMC tests of the system we encountered several sensing issues at certain frequencies. The troubling tests were IEC 61000-4-6 and IEC 61000-4-3. The big problem was that the RF "noise" coupled onto the cables was effectively demodulated by the Hall effect sensor, which then output a constant DC voltage that did not correspond to the real DC current flowing in the cable during the test. I think the RF signal described by the IEC standard is demodulated by the Hall sensor, so the real output cannot be recovered and the noise cannot be filtered out.
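What you're describing sounds like classic audio rectification: any weak even-order nonlinearity in the sensor's front end squares the injected RF and demodulates its envelope, leaving a DC/low-frequency error that no output filter can remove because it lands in the same band as the real signal. Here's a minimal numpy sketch of that mechanism; the square-law model, the coefficient, and all the signal levels are assumptions for illustration, not the actual internals of the LEM DHAB:

```python
import numpy as np

# IEC 61000-4-6/-4-3 disturbances are an RF carrier with 80% AM at 1 kHz.
fs    = 200e6                      # sample rate (Hz)
t     = np.arange(0, 1e-3, 1/fs)   # 1 ms of signal
f_rf  = 10e6                       # carrier inside the 61000-4-6 band
f_mod = 1e3                        # 1 kHz modulation per the standard
rf    = 0.1 * (1 + 0.8*np.sin(2*np.pi*f_mod*t)) * np.sin(2*np.pi*f_rf*t)

v_true = 2.5                       # legitimate DC sensor output (assumed)
a2     = 0.5                       # assumed 2nd-order nonlinearity coefficient
v_in   = v_true + rf
# Weak square-law term acting on the RF component: the x**2 term rectifies
# the carrier, turning RF power into a DC shift plus 1 kHz ripple.
v_out  = v_in + a2 * (v_in - v_true)**2

print(f"true output:   {v_true:.4f} V")
print(f"mean after RF: {v_out.mean():.4f} V")  # shifted DC, indistinguishable
                                               # from a real current change
```

The shift scales with RF power, which is consistent with the effect appearing only at certain test frequencies (where cable resonances maximize the coupled level). In practice this is usually attacked at the input rather than the output: ferrites on the cable, an RC or feedthrough filter right at the sensor pins, and short, shielded runs, so the RF never reaches the nonlinearity in the first place.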