The XOR problem in dimension 2 appears in most introductory books on neural networks. The problem consists of four 2-dimensional binary data vectors, each with a binary output, 0 or 1, according to the following table:
(0,0)--->0
(0,1)--->1
(1,0)--->1
(1,1)--->0
The task is to obtain a multilayer feedforward, 2-input, single-output neural network that recognizes the data, that is, when evaluated on the binary vectors the outputs are as in the table. Most often it is used to illustrate that, because the data are not linearly separable, a single perceptron unit cannot possibly perform the task.
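While no single perceptron can compute XOR, one hidden layer of two threshold units suffices. As a sketch, here is one hand-picked (not unique) set of weights for a 2-2-1 threshold network: one hidden unit computes OR, the other AND, and the output fires when OR is true but AND is not.

```python
import numpy as np

def step(z):
    # Heaviside threshold activation
    return (z > 0).astype(int)

# Hand-picked weights (one of many solutions):
# hidden unit 1 = OR(x1, x2), hidden unit 2 = AND(x1, x2),
# output = h1 AND NOT h2, which is exactly XOR.
W1 = np.array([[1.0, 1.0],   # OR unit
               [1.0, 1.0]])  # AND unit
b1 = np.array([-0.5, -1.5])
W2 = np.array([1.0, -1.0])
b2 = -0.5

def xor_net(x):
    h = step(W1 @ x + b1)
    return int(step(W2 @ h + b2))

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, '--->', xor_net(np.array(x)))
```

Evaluating the network on the four binary vectors reproduces the table above.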
The XOR problem in dimension n consists of 2^n binary data vectors, each with a binary output equal to the sum mod 2 of the components of the vector (that is, the parity of the vector).
For example, in dimension n=3 there are eight binary vectors and the table is:
(0,0,0)--->0
(1,0,0)--->1
(0,1,0)--->1
(0,0,1)--->1
(1,1,0)--->0
(1,0,1)--->0
(0,1,1)--->0
(1,1,1)--->1
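For experiments, the full n-dimensional table can be enumerated directly; a minimal sketch (the function name `parity_data` is my own choice):

```python
from itertools import product

def parity_data(n):
    """All 2**n binary vectors paired with their parity (sum mod 2)."""
    return [(v, sum(v) % 2) for v in product((0, 1), repeat=n)]

# Reproduce the n=3 table above
for v, y in parity_data(3):
    print(v, '--->', y)
```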
Information about how well backpropagation (or similar techniques) works on this problem, for moderate values of n, say n=8, 9, or 10, would be highly appreciated.
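For anyone who wants to try it, a bare-bones experiment: a one-hidden-layer sigmoid network trained on the full parity table with plain batch backpropagation on squared error. This is only a sketch (the hyperparameters are arbitrary, and convergence on parity is notoriously sensitive to initialization and learning rate, so a given run may or may not reach 100% accuracy):

```python
import numpy as np
from itertools import product

def train_parity(n, hidden=16, epochs=5000, lr=0.5, seed=0):
    """Train a 1-hidden-layer sigmoid MLP on n-bit parity with plain
    batch backpropagation; return accuracy on the full 2**n table."""
    rng = np.random.default_rng(seed)
    X = np.array(list(product((0, 1), repeat=n)), dtype=float)
    y = (X.sum(axis=1) % 2).reshape(-1, 1)
    W1 = rng.normal(0, 1, (n, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 1, (hidden, 1)); b2 = np.zeros(1)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        h = sig(X @ W1 + b1)          # forward pass
        p = sig(h @ W2 + b2)
        d2 = (p - y) * p * (1 - p)    # squared-error gradient through sigmoid
        d1 = (d2 @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d2 / len(X); b2 -= lr * d2.mean(axis=0)
        W1 -= lr * X.T @ d1 / len(X); b1 -= lr * d1.mean(axis=0)
    return ((p > 0.5) == y).mean()

print('accuracy on n=4 parity:', train_parity(4))
```

Since parity uses the full table as training data, "accuracy" here measures memorization, not generalization; for n=8-10 the training time and the chance of bad local minima both grow noticeably.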