Logistic regression is used when the outcome is binary (0/1). Linear models are used for continuous outcomes. If the outcome is a whole (counting) number, you could use Poisson or Negative Binomial regression as necessary, but linear models are usually good enough. Also, if the outcome is ordinal (e.g. "good" vs. "fair" vs. "bad"), it is possible to use ordinal regression, a variant of logistic regression.
The outcome variable does not have to be normally distributed. Only the residuals need to be normally distributed (after adjusting for predictors), though this is rarely a requirement for publication in the real world.
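A minimal sketch of the model choices just described, assuming Python with the statsmodels package (the data are made up purely for illustration):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import jarque_bera

rng = np.random.default_rng(1)
x = rng.normal(size=300)
X = sm.add_constant(x)  # intercept + one predictor

# Continuous outcome -> ordinary linear regression (OLS).
y_cont = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=300)
ols_fit = sm.OLS(y_cont, X).fit()

# Binary (0/1) outcome -> logistic regression.
y_bin = rng.binomial(1, 1 / (1 + np.exp(-(0.3 + 0.8 * x))))
logit_fit = sm.Logit(y_bin, X).fit(disp=False)

# Count outcome -> Poisson regression (Negative Binomial if overdispersed).
y_count = rng.poisson(np.exp(0.2 + 0.4 * x))
pois_fit = sm.GLM(y_count, X, family=sm.families.Poisson()).fit()

print(ols_fit.params, logit_fit.params, pois_fit.params)

# The normality assumption applies to the linear model's residuals,
# not to the raw outcome; Jarque-Bera is one quick check of that.
print(jarque_bera(ols_fit.resid))
```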
The appropriate type of regression is selected based on the distribution of the dependent variable. In this case the dependent variable does not follow a normal distribution, meaning that one of the underlying assumptions of multiple linear regression would be violated if that model were fitted. Logistic regression is used when the dependent variable follows a binomial distribution (technically a Bernoulli distribution, which is just a particular case of the binomial).
Conceptually, a GLM is an extension of the linear model that supports other outcome distributions, such as the binomial, through a "link function". When you use the binomial family with its logit link for binary data (0/1), it is a logistic regression.
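A small sketch of that equivalence, assuming Python with statsmodels (toy data, illustrative only): the dedicated logistic routine and a GLM with a binomial family give the same coefficients.

```python
import numpy as np
import statsmodels.api as sm

# Toy binary outcome and one predictor (illustrative data only).
rng = np.random.default_rng(0)
x = rng.normal(size=200)
p = 1 / (1 + np.exp(-(0.5 + 1.2 * x)))
y = rng.binomial(1, p)

X = sm.add_constant(x)  # add intercept column

# Dedicated logistic regression routine...
logit_fit = sm.Logit(y, X).fit(disp=False)

# ...and the same model expressed as a GLM with a binomial family
# (the logit is the default link for Binomial).
glm_fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()

print(logit_fit.params)  # coefficients match glm_fit.params
print(glm_fit.params)
```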
For logistic regression, it is possible to have the outcome expressed as grouped counts (e.g. 8 out of 10 trials succeeded for this combination of independent variables). You will want to use logistic regression when there is an upper limit (it is not possible to score more than 10 out of 10). If there is no upper limit, for example the number of cars passing on a freeway per hour, then you would switch to linear, Poisson, or Negative Binomial regression.
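One way to fit the grouped-counts form, sketched with statsmodels (the dose/trial numbers below are made up): the binomial family accepts the outcome as (successes, failures) pairs.

```python
import numpy as np
import statsmodels.api as sm

# Grouped binomial data: successes out of a fixed number of trials
# for each level of the predictor (made-up numbers for illustration).
dose      = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
successes = np.array([1,   3,   6,   8,   9])
trials    = np.array([10,  10,  10,  10,  10])

# statsmodels' Binomial family accepts a two-column outcome:
# [successes, failures] for each group.
endog = np.column_stack([successes, trials - successes])
exog = sm.add_constant(dose)

fit = sm.GLM(endog, exog, family=sm.families.Binomial()).fit()
print(fit.summary())
```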
Ruvinda, if you use logistic regression then you can have only one outcome variable (which should probably be "Result" from the information you provided). Another thing to consider is that logistic regression is typically used to identify risk factors for the outcome of interest based on your group of subjects, some with and some without the outcome of interest, and each with various levels or presence of the risk factors (independent variables). But you can also use logistic regression for predictive modeling, running it "in reverse" so to speak: you can input a single individual's pattern of risk factors (weight, age, infection, etc.) and the model will predict whether that individual will have the outcome or not, based on the logistic model you developed for the entire sample (or a training sample). Obviously, with linear regression you can do that too if the outcome variable is continuous.
I've never actually done this aside from playing with some data a couple of times, but I believe you can have the model classify the outcome (predicted to occur or not) using a probability cut-off that you specify (somebody correct me if I'm wrong on this!). I think the default in SPSS is 50%, i.e., if there's at least a 50% chance the outcome will occur for an individual, based on your logistic regression model using the entire sample, then the model will predict that it will occur for that individual. But of course it might not actually occur, so there's a likelihood or confidence level involved. Your logistic model for the entire sample will give you a classification table comparing the numbers of subjects with the actual vs. predicted outcome, and the percent correctly classified. You can then compute a kappa statistic for the actual vs. predicted outcomes to see how well the model agrees with your actual outcomes.
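The workflow in the last two paragraphs can be sketched in code. The answer above refers to SPSS; the snippet below is only an assumed Python stand-in using statsmodels and scikit-learn, and the variable names (weight, age) and values are hypothetical.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.metrics import confusion_matrix, cohen_kappa_score

# Toy data standing in for the sample: two risk factors and a 0/1 outcome.
rng = np.random.default_rng(2)
weight = rng.normal(70, 10, size=250)
age = rng.normal(50, 12, size=250)
lin = -20 + 0.15 * weight + 0.18 * age
outcome = rng.binomial(1, 1 / (1 + np.exp(-lin)))

X = sm.add_constant(np.column_stack([weight, age]))
fit = sm.Logit(outcome, X).fit(disp=False)

# Predicted probability for a single new individual (hypothetical values).
new_person = np.array([[1.0, 82.0, 63.0]])  # [intercept, weight, age]
print("P(outcome) for new individual:", fit.predict(new_person)[0])

# Classification at the common 0.5 cut-off (the threshold can be changed).
predicted = (fit.predict(X) >= 0.5).astype(int)
print(confusion_matrix(outcome, predicted))   # actual vs. predicted table
print(cohen_kappa_score(outcome, predicted))  # agreement beyond chance
```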
Another option with logistic regression is multinomial or polytomous logistic regression, where the outcome variable can have more than two categories of interest. For some reason this isn't widely used, but from the problem you present (two possible outcomes?) it might be appropriate.
And finally you should consider your sample size requirements. There is no concrete rule with logistic regression per se, but a rule of thumb is that you need at least ten to twenty subjects for each predictor variable - e.g., if you have three predictor variables then this would require at least 30 to 60 subjects for your logistic regression model.
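A trivial sketch of that rule of thumb in Python (just the 10-to-20-subjects-per-predictor heuristic from above, not a formal power calculation):

```python
# Rule-of-thumb check: roughly 10 to 20 subjects per predictor variable.
n_predictors = 3
lower_bound = 10 * n_predictors
comfortable = 20 * n_predictors
print(f"Aim for roughly {lower_bound} to {comfortable} subjects.")
```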
As Oliver Gutiérrez Hernández said, logistic regression is "kind of" a generalized linear model. Actually, generalized linear models are a flexible tool that allows you to choose the probability distribution in your model according to your data type and still solve a "linear equation". For example, in your case the outcome (0/1) can be described by the binomial distribution. In this case the GLM will do something like this:
logit(p) = beta0 + beta1*x1 + ... + betan*xn, where p is the probability that the outcome equals 1
You can obtain much more information than only odds ratios (OR) from this model. Actually, if you use the inverse function of the logit, exp(x)/(1+exp(x)), you can extract the mean value (the predicted probability)...
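A small sketch of that last step, assuming statsmodels and scipy (toy data only): exponentiating the coefficients gives odds ratios, and the inverse logit of the linear predictor gives the fitted probabilities.

```python
import numpy as np
import statsmodels.api as sm
from scipy.special import expit  # expit(x) = exp(x) / (1 + exp(x)), the inverse logit

rng = np.random.default_rng(3)
x = rng.normal(size=200)
y = rng.binomial(1, expit(-0.4 + 1.1 * x))

X = sm.add_constant(x)
fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()

# Odds ratios come from exponentiating the coefficients...
print("odds ratios:", np.exp(fit.params))

# ...while the inverse logit of the linear predictor gives the fitted
# probability (the mean of the outcome) for each observation;
# this matches fit.fittedvalues.
linear_predictor = X @ fit.params
print("first few fitted probabilities:", expit(linear_predictor)[:5])
```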
A good place to start for a better understanding of all this is Dobson's book "An Introduction to Generalized Linear Models".