Hi. I tried using the ReLU activation function on the XOR problem to see how it performs, because many posts and pages say it's better than sigmoid and other activations. The full post with code snippets is on the forum of the Neuroph Framework, which I'm using:
https://sourceforge.net/p/neuroph/discussion/862858/thread/36006e43/
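For reference, XOR is solvable in principle with a single ReLU hidden layer. Here is a minimal plain-numpy sketch (not Neuroph; the weights are hand-set rather than trained) that shows the architecture itself can represent XOR:

```python
import numpy as np

# Hand-set weights (not learned) for a 2-2-1 network with ReLU hidden units.
W1 = np.array([[1.0, 1.0],
               [1.0, 1.0]])   # input -> hidden weights
b1 = np.array([0.0, -1.0])    # hidden biases
W2 = np.array([1.0, -2.0])    # hidden -> output weights

def relu(z):
    return np.maximum(0.0, z)

def xor_net(x):
    h = relu(x @ W1 + b1)     # ReLU hidden layer
    return h @ W2             # linear output

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, xor_net(np.array(x, dtype=float)))
# (0, 0) -> 0.0
# (0, 1) -> 1.0
# (1, 0) -> 1.0
# (1, 1) -> 0.0
```

So if training fails, the usual suspects are the training setup rather than ReLU itself: with unlucky initialization or too high a learning rate, ReLU units can get stuck outputting zero for all inputs ("dying ReLU"), which on a tiny problem like XOR can kill the whole hidden layer.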