The standard proof runs as follows: you start with the tensor algebra T(V) of the vector space in question, say V. Then you take the quotient by the two-sided ideal generated by elements of the form

$v \otimes v - q(v)\,1$, for all $v \in V$.

Here q is the quadratic form on V. That part works for me; the problem comes when I try to see that this ideal is not too big (i.e., that it is proper and nontrivial). The standard reference [H.B. Lawson, M.-L. Michelsohn, Spin Geometry] says, in effect: calm down, this is no big deal. Take an element in the intersection of this ideal with V. On the one hand, this element has the standard expression as a sum of products of generators of the ideal with elements of the tensor algebra. On the other hand, it is an element of V, so all the terms of maximal degree must cancel. And then comes the obscure trick I can't figure out: they take a weird contraction with q and conclude by induction that the previous fact (the maximal-degree terms vanish) implies that the element itself is zero.
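To fix notation (the symbols below are mine, not the book's; take this as my attempted reconstruction of the setup, not of the trick itself):

```latex
% Let I_q \subset T(V) be the two-sided ideal generated by
% the elements v \otimes v - q(v)\,1, and suppose x \in I_q \cap V.
% Then x can be written as
x \;=\; \sum_i a_i \otimes \bigl(v_i \otimes v_i - q(v_i)\,1\bigr) \otimes b_i,
\qquad a_i,\, b_i \in T(V) \text{ homogeneous.}
% Let d = \max_i \bigl(\deg a_i + \deg b_i + 2\bigr). Since x \in V has
% tensor degree 1 < d, the degree-d component of the right-hand side vanishes:
\sum_{\deg a_i + \deg b_i + 2 \,=\, d} a_i \otimes v_i \otimes v_i \otimes b_i
\;=\; 0 \quad \text{in } V^{\otimes d}.
% The step I cannot reconstruct is the contraction with q applied to this
% identity, which is supposed to lower d and drive the induction down to x = 0.
```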

Please, if you can clarify this last part (or point me to a fresh reference where I can find another proof scheme), I'll be grateful.