I am currently designing and troubleshooting a fluorescence anisotropy-based assay in which the anisotropy of an NBD-labelled drug molecule is used to determine its binding affinity to a transporter protein.

I have run into a particular issue: the anisotropy of the drug molecule alone decreases with increasing concentration (measurements carried out in 1x PBS). From what I have read, the anisotropy of a free probe should not change with its concentration.

This has led to my being unable to determine binding affinities by titrating the ligand concentration. The only way I have been able to observe binding-dependent changes in anisotropy has been to fix the probe concentration and vary the protein concentration.
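For what it's worth, the fixed-probe/protein-titration format you describe is the standard way these data are fit. Below is a minimal sketch of fitting a one-site binding model (with ligand depletion, via the quadratic solution) to anisotropy vs. total protein concentration. All concentrations, parameter values, and the probe concentration `L_T` are hypothetical placeholders, not values from your assay:

```python
import numpy as np
from scipy.optimize import curve_fit

L_T = 0.05  # fixed total probe concentration in uM (hypothetical value)

def anisotropy(P_T, Kd, r_free, r_bound):
    """One-site binding with ligand depletion.

    Fraction bound comes from the quadratic solution of
    Kd = [P][L] / [PL] with mass balance on probe and protein,
    so it stays valid even when [probe] is comparable to Kd.
    """
    b = P_T + L_T + Kd
    frac_bound = (b - np.sqrt(b**2 - 4.0 * P_T * L_T)) / (2.0 * L_T)
    # Observed anisotropy is a population-weighted average of the
    # free-probe and bound-probe anisotropies.
    return r_free + (r_bound - r_free) * frac_bound

# Simulated titration (purely illustrative, noiseless)
P = np.array([0.0, 0.05, 0.1, 0.25, 0.5, 1.0, 2.5, 5.0, 10.0])  # uM protein
r_obs = anisotropy(P, Kd=0.5, r_free=0.05, r_bound=0.25)

# Fit Kd, r_free, and r_bound from the titration curve
popt, pcov = curve_fit(anisotropy, P, r_obs, p0=[1.0, 0.04, 0.2])
Kd_fit, r_free_fit, r_bound_fit = popt
```

If the bound probe's quantum yield differs from the free probe's, the simple population-weighted average above is no longer valid and the bound fraction must be intensity-weighted, which may be relevant for an environment-sensitive fluorophore like NBD.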

Does anyone have any ideas as to why this is happening? I would appreciate any help.
