It doesn't really matter whether the dataset consists of integer values or not. If your classes are linearly separable (i.e., they can be separated by a hyperplane in the n-dimensional space defined by your input of length n), a perceptron will work; otherwise you will need to add at least one hidden layer and use back-propagation.
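To make the linearly-separable case concrete, here is a minimal sketch of the classic perceptron update rule, trained on the AND function (which is linearly separable). The function names and toy data are my own, not from the question:

```python
# Minimal perceptron sketch: learns AND, a linearly separable function,
# with the classic error-driven update rule w <- w + lr * (y - pred) * x.

def train_perceptron(samples, epochs=20, lr=0.1):
    """samples: list of (inputs, label) pairs with label in {0, 1}."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in samples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred                      # 0 when the prediction is correct
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # AND gate
w, b = train_perceptron(data)
```

By the perceptron convergence theorem this loop is guaranteed to find a separating hyperplane here; on non-separable data (e.g. XOR) it would cycle forever, which is exactly why the hidden layer becomes necessary.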
I would go for a neural network (feed-forward + back-propagation). Depending on your problem you could use several layers, perhaps apply simple PCA to extract features first, or even go for modern auto-encoders or RBMs. Not a real expert, but plain perceptrons are really outdated.
In classification problems, the kind of separability your data requires is really important. I suggest using a multilayer perceptron (MLP), because it makes it possible to obtain more complex separation surfaces; of course, if your data are linearly separable, the MLP can handle that case too.
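As an illustration of those more complex separation surfaces: a single perceptron cannot compute XOR, but one hidden layer can. The sketch below hand-wires the weights (nothing is learned here) just to show that a hidden layer buys a non-linear decision boundary; the helper names are my own:

```python
# Hand-wired two-layer network computing XOR, which no single
# perceptron can separate. Hidden units act as OR and AND gates;
# the output unit combines them as "OR and not AND".

def step(z):
    return 1 if z > 0 else 0

def xor_mlp(x1, x2):
    h1 = step(x1 + x2 - 0.5)     # hidden unit 1: OR gate
    h2 = step(x1 + x2 - 1.5)     # hidden unit 2: AND gate
    return step(h1 - h2 - 0.5)   # output: OR and not AND = XOR
```

In practice the hidden-layer weights would be found by back-propagation with a differentiable activation instead of the step function, but the separating geometry is the same.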
An MLP will give you more flexibility to find a way to separate the data. If the data are already linearly separable, both the MLP and the perceptron will work.
A single perceptron can only solve problems that are linearly separable. It does not matter whether the classes consist of integer numbers or not. On the other hand, BP can solve non-linearly separable problems because of the multi-layer topology and the nature of the transfer function. If you have a multiclass problem, a single perceptron can only separate between two classes, so you have to find a way to generalize the solution to the multiclass paradigm. There are standard ways to do this, such as one-against-one and one-against-all. In conclusion, if you have a multiclass problem, which in most cases is non-linearly separable, use BP.
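The one-against-all scheme mentioned above can be sketched as follows: train one binary classifier per class (class k vs. everything else) and predict with the highest raw score. I use a plain perceptron as the binary learner on three well-separated toy clusters; the data and names are mine, not from the answer:

```python
# One-against-all multiclass wrapper around a binary perceptron.

def train_binary(samples, epochs=100, lr=0.1):
    """samples: list of (inputs, label) with label in {0, 1}."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in samples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def train_one_vs_all(samples, n_classes):
    """Train one 'class k vs. the rest' perceptron per class."""
    models = []
    for k in range(n_classes):
        relabeled = [(x, 1 if y == k else 0) for x, y in samples]
        models.append(train_binary(relabeled))
    return models

def predict_ova(models, x):
    # pick the class whose binary model gives the highest raw score
    scores = [sum(wi * xi for wi, xi in zip(w, x)) + b for w, b in models]
    return scores.index(max(scores))

# three well-separated 2-D clusters, one per class
data = ([((0 + dx, 0 + dy), 0) for dx in (0, 1) for dy in (0, 1)] +
        [((6 + dx, 0 + dy), 1) for dx in (0, 1) for dy in (0, 1)] +
        [((0 + dx, 6 + dy), 2) for dx in (0, 1) for dy in (0, 1)])
models = train_one_vs_all(data, 3)
```

One-against-one works analogously but trains a classifier per *pair* of classes and predicts by majority vote, which costs k(k-1)/2 models instead of k.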
Hint: did you consider other classifiers, such as SVM or AdaBoost?
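For reference, AdaBoost is simple enough to sketch from scratch. Below is a toy version boosting decision stumps on a 1-D set where the positive class sits in the middle, so no single threshold (and no single perceptron) separates it, but a weighted vote of a few stumps does. Everything here (names, data, round count) is my own illustration:

```python
import math

def stump_predict(feat, thresh, sign, x):
    """Axis-aligned decision stump: sign if x[feat] > thresh else -sign."""
    return sign if x[feat] > thresh else -sign

def train_adaboost(samples, n_rounds=5):
    """samples: list of (features, label) with label in {-1, +1}."""
    m = len(samples)
    weights = [1.0 / m] * m
    ensemble = []
    n_feats = len(samples[0][0])
    for _ in range(n_rounds):
        # exhaustively pick the stump with lowest weighted error
        best = None
        for feat in range(n_feats):
            for thresh in sorted({x[feat] for x, _ in samples}):
                for sign in (-1, 1):
                    err = sum(w for (x, y), w in zip(samples, weights)
                              if stump_predict(feat, thresh, sign, x) != y)
                    if best is None or err < best[0]:
                        best = (err, feat, thresh, sign)
        err, feat, thresh, sign = best
        err = max(err, 1e-10)                      # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)    # stump's vote weight
        ensemble.append((alpha, feat, thresh, sign))
        # re-weight: misclassified points get heavier, correct ones lighter
        weights = [w * math.exp(-alpha * y * stump_predict(feat, thresh, sign, x))
                   for (x, y), w in zip(samples, weights)]
        total = sum(weights)
        weights = [w / total for w in weights]
    return ensemble

def predict_boost(ensemble, x):
    s = sum(a * stump_predict(f, t, sg, x) for a, f, t, sg in ensemble)
    return 1 if s > 0 else -1

# positives in the middle of the line: not linearly separable in 1-D
data = [((0,), -1), ((1,), 1), ((2,), 1), ((3,), -1)]
model = train_adaboost(data, n_rounds=5)
```

An SVM with a non-linear kernel attacks the same non-separability differently, by mapping the inputs into a higher-dimensional space where a hyperplane suffices.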
In this situation there is not a great difference between the two. The important point is that the dataset parameters must be related to each other: as in a network, the input parameters must carry information that maps to the output parameters.