We are using giving-up densities (GUDs) to measure perceived predation risk in voles.

This is a forced-choice setting with three foraging patches, each with its own risk level.

Some individuals started to forage in the "dangerous" patches and cache the food in the "safe" patches, raising those patches above their initial food levels.

How do I account for this in the statistics?

We plan to report proportions (food remaining / food initially offered), so values should range between 0 and 1. However, the latest individual cached so much food that it pushed the value to 1.2.

My idea was either to force a cut-off at 1, or to subtract the additional food, i.e. turn the 1.2 into 0.8.

Neither option seems perfect.
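For what it's worth, the two corrections described above are easy to compare side by side. Here is a minimal Python sketch (the function names are mine, not from any package) showing the cut-off option and the subtract-the-excess option applied to the same value:

```python
# Two candidate corrections for GUD proportions that exceed 1
# because voles cached extra food in a patch.
# Assumes proportion = food_remaining / food_initially_offered.

def cap_at_one(p):
    """Option 1: truncate any proportion above 1 back to 1."""
    return min(p, 1.0)

def subtract_excess(p):
    """Option 2: treat the surplus as food removed, i.e. 1.2 -> 0.8."""
    return p if p <= 1.0 else 1.0 - (p - 1.0)

print(cap_at_one(1.2))       # -> 1.0
print(subtract_excess(1.2))  # -> 0.8
print(subtract_excess(0.7))  # unchanged for values already in [0, 1]
```

Note that the two options encode different assumptions: capping says "this patch was effectively untouched", while subtracting says "the surplus measures how much the vole actually exploited elsewhere", so they are not interchangeable statistically.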

Thanks in advance!