In order to single out the contexts under which two quantum observables are considered to have the same probability for spin up and/or spin down, such that non-contextuality of probabilities singles out the unique probability function, there must presumably be other functions of the quantum state which are probability functions (i.e. satisfy the three axioms) and do not give merely zero/one probabilities, but which violate non-contextuality of probability.
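For concreteness, here is what I take the three axioms to be (my reconstruction of the usual conditions on a measure over projectors P, so treat it as an assumption rather than a quotation):

\[
p(P) \ge 0, \qquad p(I) = 1, \qquad p\Big(\textstyle\sum_i P_i\Big) = \sum_i p(P_i) \ \text{for mutually orthogonal } P_i.
\]

And a toy example of the kind of function I have in mind: drop the requirement that p depend on the projector alone, and for each orthonormal basis B = \{e_1, e_2, e_3\} (listed in some fixed conventional order) set

\[
p(e_1 \mid B) = \tfrac{1}{2}, \qquad p(e_2 \mid B) = \tfrac{1}{3}, \qquad p(e_3 \mid B) = \tfrac{1}{6}.
\]

Within each context this is non-negative and sums to one, but a vector shared by two bases can receive 1/2 in one and 1/6 in the other, so non-contextuality fails; the point of Gleason's result would then be that once such contextuality is forbidden, no assignment survives except the Born one.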
1. I presume the contexts used by Gleason in his derivation (where he invokes non-contextuality of probability) are those in which the particle is identically prepared and measured, such that the particle has the same relative phase and amplitude. I presume the other functions were not ruled out directly: rather, they assign different probabilities to the same observable in two different contexts because the relative phase (or the sign of the amplitude), or some other factor, differs between those contexts, and these other probability functions are sensitive to that difference.
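For what it is worth, my understanding (which may be off) is that in Gleason's setting a context is just the choice of orthonormal basis, i.e. which maximal observable is co-measured, with the preparation held fixed; two contexts sharing a ray v would be

\[
C_1 = \{v, w_1, w_2\}, \qquad C_2 = \{v, u_1, u_2\},
\]

and non-contextuality of probability is the demand that p(v) be one and the same number whether v is measured alongside the w's or the u's.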
Presuming that the probability function is just a function of the state, is the following correct?
1. That two-way correlations (modal anti-correlations or entanglement) establish equal probabilities at some point in the process (due to equivalence in truth value), and that some other value assignment would get this wrong;
2. Or there is some factor, such as phase, that differs for the same observable in different contexts, or that differs for some other two bases between the two contexts, so that a contextual function gets the probabilities or the interference wrong;
3. Or gets wrong the strict correlations and anti-correlations in standard entangled states (which induce equiprobability);
4. Or rather, does Gleason not really make use of contextuality (or modal entanglements between contexts) so much as heavy use of standard entanglements and non-contextuality of probabilities to establish his proof? And if so, is there no counterfactual reasoning going on (I would presume so), of the form "A in context 1 would have occurred iff ~A in context 2 would have occurred"? Or does he use standard entanglements indirectly: for example, there is some event B with which the observable is actually entangled, and the different measurement settings in the two contexts alter its correlation with A from a strict positive correlation to an anti-correlation, so that B occurs in context 2 iff A occurs in context 2, and B occurs in context 1 iff ~A occurs in context 1. Non-contextuality is then used again to establish that B is equiprobable in both contexts, thus A in context 2 is equally probable with ~A in context 1; but by non-contextuality the probability of A in context 1 equals the probability of A in context 2, and thus equals the probability of ~A in context 1 (see the first sketch after this list).
5. Presumably one could argue that the probability function itself changes, not merely the values it gives (whilst remaining an indeterministic function nonetheless), in such a way that it delivers non-contextual probabilities, which then accord with the Born rule. I am not talking about non-contextual probabilities as such, just something that gives the same prescriptions as the Born rule, at least insofar as we have investigated. For example, if there do exist other functions of the quantum state (the amplitudes and phases) which are probability functions but give contextual probability assignments, is it possible to use one function in one context and a totally different function in another? For whilst they deliver different probability values to A in two different contexts, it might be that function one gives A the same probability in context 1 as function two does in context 2, so that this mixed function delivers non-contextual probabilities, and the same probability values, which accord with the Born rule (see the second sketch after this list).
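On point 4: as far as I can tell, the usual presentation of the proof needs no counterfactuals, only normalization within each basis plus non-contextuality of the shared ray; a minimal sketch of the interlocking-bases step (my paraphrase, not Gleason's notation):

\[
p(v) + p(w_1) + p(w_2) = 1 = p(v) + p(u_1) + p(u_2) \;\Longrightarrow\; p(w_1) + p(w_2) = p(u_1) + p(u_2).
\]

The cancellation of p(v) is licensed precisely because non-contextuality says the two occurrences of v carry a single probability; chaining such overlapping bases is what forces continuity and ultimately the Born form, without any talk of what would have occurred in the other context.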
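And to make the mixing idea of point 5 explicit (f_1 and f_2 here are hypothetical contextual probability functions of my own invention, not anything in the literature):

\[
f_{\mathrm{mix}}(A \mid C) = \begin{cases} f_1(A \mid C_1) & \text{if } C = C_1, \\ f_2(A \mid C_2) & \text{if } C = C_2, \end{cases} \qquad \text{with } f_1(A \mid C_1) = f_2(A \mid C_2) = |\langle A \mid \psi \rangle|^2,
\]

so that although f_1 and f_2 are each contextual, the mixed function's outputs agree across the two contexts and match the Born values.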
I presume then it is just a matter of Occam's razor: it is simplest to have a constant function. I presume it is not possible, by this mixing process, to preserve non-contextuality of probabilities and the fact that these functions are probability functions, while giving a different prescription from the Born rule; otherwise some further presumption, that the probability function is a constant function of the state, would have to be made to deliver the Born rule verdicts. But if these amplitudes are not invented post hoc, and were already there in the formalism and had some meaning, then I guess it is less otiose; one could say the amplitudes mod squared give the value of the probability, and perhaps these are easier to measure, but are not the probabilities themselves. If probabilities are ontic entities, they could be some other thing whose function changes (so that the mixed values do not change across contexts) but delivers the same verdicts as the Born rule. Of course, one could have mixed non-probabilistic functions that average out, or change, so as to deliver probabilistic verdicts (if A occurs, the probability of A was 0.9 and of ~A 0.4; if ~A occurred, its probability was 0.1 and A's probability was 0.6), but that is going over the top.
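To be explicit about the "amplitudes mod squared" reading, the Born rule for a pure state is

\[
p(v) = |\langle v \mid \psi \rangle|^2 = \mathrm{Tr}(\rho P_v), \qquad \rho = |\psi\rangle\langle\psi|,
\]

so the mod-squared amplitude gives the value of the probability; whether it is the probability itself, or merely tracks some other ontic quantity that delivers the same verdicts, is exactly the question above.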
So I presume the only assumptions are: 1. continuity (which follows from non-contextuality in many cases); 2. non-contextuality; 3. that it is representable by a probability function; 3b. that it is a probability function; and 4. the empirical restriction to three or more dimensions. There is not really a presumption that it is a function, or a constant function, of the state, at least insofar as deriving the Born rule probability values is concerned (that follows from non-contextuality and entanglement), unless one is overly concerned with the uniqueness of the Born function, as opposed to the values it delivers, or with its being the only function that gives those values (in which case it is the only constant function which does so, instead).
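For comparison, the statement of Gleason's theorem as I understand it: for a Hilbert space of dimension at least three, any map \mu on projectors satisfying

\[
\mu(P) \ge 0, \qquad \mu(I) = 1, \qquad \mu\Big(\textstyle\sum_i P_i\Big) = \sum_i \mu(P_i) \ \text{for mutually orthogonal } P_i
\]

is of the form \mu(P) = \mathrm{Tr}(\rho P) for some density operator \rho; non-contextuality is built in by taking \mu to be a function of the projector alone, not of the basis it sits in.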