Galileo deduced the law of gravity (1/2 g t^2) by observing balls rolling on an inclined plane. However, without Occam's razor, there is no reason to refute the following law: until today the law of gravity is 1/2 g t^2, and from tomorrow on the law of gravity will be -1/2 g t^2. This law satisfies Karl Popper's criterion: it is falsifiable, so it is a scientific law. Without Occam's razor we have to wait until tomorrow, hoping that I was wrong and Galileo was right.
We have the same problem in machine learning. Given a set of data and a powerful learning machine such as an SVM or an ANN, there are infinitely many solutions that fit the data. To make the problem well posed, I need to find the simplest one that fits the data.
Ockham's razor (properly Ockham, actually, rather than Occam) has been falsified a few times quite spectacularly, and as such is questionable (see Lawrence Krauss's latest book for a quite spectacular example).
Furthermore, its application is often squarely in the eye of the beholder. As a case in point, Ockham's razor has been invoked both to argue against a multiverse (Michael Frayn, etc.) and to argue forcefully in favor of a multiverse (Max Tegmark et al.).
All in all, it is probably a loose guideline in many cases, but one that can never replace other approaches and has to be taken with a pinch of salt.
To reduce the reasons for refusing the '-1/2gt² prognosis' to a general principle, such as 'Occam's razor', is misleading. If we lived in a world in which all signs in natural laws changed from day to day, this would be a good prognosis, irrespective of Occam's razor. Actually, the acceptability of a proposed law depends on the available experience and insight in the discipline under consideration. Lazy people, not willing to familiarize themselves with this experience and insight, ask for simple principles which do not exist.
Around 1963 the philosopher Wolfgang Stegmüller held a half-year course on 'scientific explanations' at the University of Munich in which he also discussed cases like the '-1/2gt² prognosis'. He gave a rich set of criteria, each with surprising examples of how they can mislead. In each of these cases it was not difficult for an educated physicist to give good reasons why the proposed criterion could not work in all cases. The following analogy suggested itself to me: being at home in some science is like being at home in one's mother tongue. To any question of the type 'is it correct to say ...' the native speaker has a quick and sure answer, but he typically has major difficulties deriving this answer from general and simple principles of his language. Nevertheless he has the clear feeling that it was a simple question. This simplicity is an illusion, and true simplicity can be dug out only by hard work.
Actually, Galileo found out that g is the same for different bodies. The law of gravity belongs to Newton (or Hooke, if you wish). But the main thing: I did not understand at what point Occam's razor was applied in obtaining the equation of the law of gravitation.
Occam's razor leads us to choose the simplest law:
1/2 g t^2, rather than: 1/2 g t^2 if t < tomorrow, and -1/2 g t^2 otherwise.
This principle is used in information theory to choose the best codebook: it is the one which minimizes the number of reconstruction errors (it fits the data) and which minimizes the number of bits needed to transmit it (it is the simplest).
If we wanted to transmit the two laws, the first one would be the best: it fits the observations and it costs fewer bits.
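To make this concrete, here is a rough sketch (the observation times, the use of zlib as a crude stand-in for an ideal code, and all the names are arbitrary choices of mine, not a real measurement): both laws leave zero error on the data observed so far, so the law with the shorter description wins under a two-part code.

import zlib

g = 9.81
times = [0.1 * k for k in range(1, 51)]            # observation times so far (s)
observed = [0.5 * g * t ** 2 for t in times]       # idealized past measurements

law_simple = "x(t) = 1/2 * g * t**2"
law_flip = "x(t) = 1/2 * g * t**2 if t < tomorrow else -1/2 * g * t**2"

def two_part_bits(law_text, predictions):
    # Two-part code: the law itself plus the residuals it leaves on the data.
    residuals = [round(o - p, 6) for o, p in zip(observed, predictions)]
    payload = (law_text + repr(residuals)).encode()
    return 8 * len(zlib.compress(payload))

predictions = [0.5 * g * t ** 2 for t in times]    # identical for both laws today
print("simple law   :", two_part_bits(law_simple, predictions), "bits")
print("sign-flip law:", two_part_bits(law_flip, predictions), "bits")
# Same fit on the past data, shorter description: the simple law is preferred.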
The strange thing, for me, is that we need Occam's razor to explain the world, and maybe it is misleading, as Ulrich said. The proposed law seems very stupid, and we reject it immediately. However, Planck was awarded the Nobel Prize for proposing the same kind of idea: that the laws of physics were different at the beginning of the universe.
We have the same problem in machine learning:
every time we want to solve a real problem, we look for simple solutions, but as we say, the devil is in the details, and the pure mathematical concept degenerates into an overly complicated contraption.
@Raphael,
I just read the 'Award Ceremony Speech' delivered in 1919 on the occasion of Planck's Nobel Prize and found no reference to laws of physics at the beginning of the universe. Which work of Planck are you referring to?
The Planck era, or Planck wall. My understanding (I am not a physicist) is that the laws of physics were different behind this wall.
@Raphael: you wrote:
... to choose the best codebook: it is the one which minimizes the number of reconstruction errors (it fits the data) and which minimizes the number of bits needed to transmit it (it is the simplest).
This is nothing else than an optimization task. I don't see here any problem clearly related to Occam's razor. Both your examples seem more like "a monkey with a razor" to me. Are some of your "codebooks" more complicated than others? Probably not; one is a permutation of the other, right?
@Marek
Any set of data can be represented by a string of symbols from a finite alphabet.
If you cannot compress your data, the Kolmogorov complexity of your data is equal to the size of the data. That is the definition of a random sequence.
Conversely, any regularity in a given set of data can be used to compress the data, i.e. to describe it using fewer symbols than are needed to describe the data literally. This description is the codebook.
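A tiny illustration of this (my own example, using zlib as a rough stand-in for an ideal compressor): a regular sequence shrinks to a short description, while a pseudo-random one of the same length stays about as long as the data itself.

import os
import zlib

regular = b"01" * 5000          # an obvious regularity, 10000 bytes
random_ = os.urandom(10000)     # incompressible with overwhelming probability

print("regular:", len(regular), "->", len(zlib.compress(regular)), "bytes")
print("random :", len(random_), "->", len(zlib.compress(random_)), "bytes")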
May I suggest reading "Elements of Information Theory" by Thomas Cover?
or http://en.wikipedia.org/wiki/Minimum_description_length ?
or http://en.wikipedia.org/wiki/Kolmogorov_complexity ?
...
I hope you will find that it is not exactly a problem for "a monkey with a razor."
Raphael,
back to Planck: Planck's natural units (e.g. for length and time) were introduced by Planck soon after the action quantum h was established. The Nobel Prize came much later and was given for the discovery of the blackbody radiation law, not for the 'soft stuff' you mentioned.
Ulrich,
Ok, you are right about the Nobel Prize.
I just mentioned the Planck wall to show that we cannot reject a proposition just because it seems incredible.
What would Newton have said if I had told him that the law of gravity depends on speed?
We can reject a proposition either because it does not explain the observations, as with the models of gravity of Galileo and Newton, or because there is a simpler one that also explains the observations: for that we need Occam's razor.
@Raphaël: Nobody knows how to compute the Kolmogorov complexity precisely in practice. In your case you have a simple test to evaluate every possible "codebook" in every particular case: the shorter the resulting code, the better your codebook. There is no place for Occam's razor here. The similarity, if any, is purely accidental. Occam's razor is inherently related to the introduction of extra parameters (objects), possibly not existing before, to explain your hypothesis. This criterion only tells you which new objects are sufficient and which are unnecessary. Your codebooks already exist, and your task is merely to check which one of them is better, not necessarily the best. A Huffman code is one of the known proposals.
I agree with Chris: Ockham's razor does not always describe reality.
"The basic idea behind Ockham's razor is that in a choice between two ideas, both of equal explanatory power, then one should choose the simplest. ( For example, the best law is the simplest one that fits the data-the example is mine and is not part of the original text).
It is used for more than that, however: as A.R. Lacey’s A Dictionary of Philosophy states, “A stronger form claims that only what cannot be dispensed with is real and that to postulate other things is not only arbitrary but mistaken.”
Armed with this information, let us notice what Ockham’s Razor is not. It is not a law of mathematics like the Pythagorean Theorem. It is not a constant, like gravitational force. It is not even an arbitrary designation. (As Bertrand Russell observed, even in the deepest regions of space there are still three feet in a yard. Naturally, since humans invented the yard.) Speaking of which, it’s not a natural law either.
Ockham’s Razor is a principle; which is to say, a piece of advice. Like “Look before you leap” or “He who hesitates is lost.” Its truth-value is entirely dependent on its functional utility in the real world. Therefore when someone speaks of a concept or theory violating Ockham’s Razor, this only means that the concept or idea is incompatible with maximal simplicity. It is not like saying a concept or idea violates the Second Law of Thermodynamics, or has as its consequence that 2 and 2 now equal 7.
The reason this is important is because frequently people use Ockham’s Razor as if it were some sort of immutable principle, rather than a piece of advice that may or may not be useful according to context. Seriously employing Ockham’s Razor would have ended progress every time such an idea came forward. Relativity is more complicated than Newtonian mechanics, but has the virtue that it explains certain things (or more precisely, certain conditions) better.
In short, no one should ever feel diminished by the observation that one’s theory fails to pass the Ockham’s Razor test. It is a lingering byproduct of an age long since passed. Simplicity may be a feature of one’s idea or it may not be, but this is not logically equivalent with truth. As always, the best arbiter of such matters is the accumulation of evidence, which is to say reality itself."
-----------------------------------------------------------------------------------------
Dulling Ockham's Razor, By Joseph E. Green, Dissenting Views, Feb 13, 2010.
Modified Aug 13, 2013 by the author
@Marek
You wrote:
" the shorter resulting code, the better is your codebook. There is no place for Occam razor here"
I do not understand. Choosing the simplest solution or the shortest code is an application of Occam's razor.
@Issam
You wrote:
"Seriously employing Ockham’s Razor would have ended progress every time such an idea came forward. Relativity is more complicated than Newtonian mechanics, but has the virtue that it explains certain things (or more precisely, certain conditions) better."
I do not agree. Relativity is, at this moment, the simplest theory that explains the observations. Ockham's razor (sorry for Occam, but I am French) balances simplicity against accuracy: the simplest theory or the simplest model that fits the observations or the data.
How do you refute my gravitation law (given the observations of Galileo) without Ockham's razor?
Without Ockham's razor, there is no science. Indeed, given any set of observations, you can find infinitely many functions or laws that fit the data.
As you said, Ockham's razor is not a theorem. It is a principle.
My point of view is that Ockham's razor exists because we need it in order to decide.
For example, Ockham's razor leads us to decide that tigers are always dangerous. In fact it depends on a lot of factors, but those who tried to take all the factors into account before running have not left any progeny.
Ockham's razor allows us to propose a useful explanation of the world. Maybe it is wrong. Relativity is probably wrong, just as the universal law of gravity was, but it is useful for studying fast-moving objects.
Raphael
Nature cannot be explained by choosing between a simple and a simpler explanation for its behaviour; that is too naive. Newtonian physics (NP, 3D + t) cannot explain phenomena at high speeds; this is why SR was developed. At the microscopic end of physics, QM was developed, but again this has its limitations. Physicists are now having to develop quantum gravity in order to describe the force of gravity according to the principles of QM. If we had to choose between simple and simpler solutions, a lot of the physics developed so far (SR, GR, QM and QG) could not have been achieved. Our theory of the very small, QM, appears to be incompatible with our theories of the very large: NP, SR and GR. We need to quantize gravity to be able to consistently handle both quantum fields and gravitational phenomena. This is a terrifying new world of physics where we find 1-dimensional objects tracing out worldsheets in 10 dimensions, and 3-geometries evolving (or not) with respect to a super-Schrödinger equation, with little, if any, contact with even potentially observable events.
Nature is beautiful, but its behaviour is certainly not a matter of choosing between a simple and a simpler theory.
@Issam
I think there is some confusion: the simplest solution does not mean a simple solution.
The world is complex. It is not surprising that physics becomes more and more complex as we make more and more precise observations.
The solution of Galileo seems simple to a physicist of our time, but deducing 1/2 g t^2 from the observation that the speed of a falling body grows proportionally to t, without knowing anything about integral calculus, was not simple for a physicist of the sixteenth century.
Ockham's razor (I checked: even the French academy prefers Ockham, even though in his time Guillaume d'Occam probably spoke French) is necessary to refute my gravitation law given the observations of Galileo (my gravitation law can of course be rejected with current observations).
My point is that we need exactly the same principle for machine learning: we need to generalize from a finite set of data to new data. To achieve this goal we need Ockham's razor or, from a Bayesian point of view, a prior; otherwise the problem is ill posed: there are infinitely many solutions (see the sketch at the end of this post).
Is it a regularity of our world? Does it prove something about God?
I do not think so. We use the same tool to study physics and machine learning: the human brain. Ockham's razor teaches us more about the functioning of our brain than about the functioning of the world. We need to simplify in order to understand, and then to act.
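Here is the sketch of what I mean by a prior making the problem well posed (the data, the polynomial degree and the penalty value are invented for illustration): a ridge penalty on the weights plays the role of Occam's razor, or of a Gaussian prior in Bayesian terms.

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 10)
y = 1.0 + 2.0 * x + 0.05 * rng.standard_normal(10)   # nearly linear, noisy data

X = np.vander(x, 9, increasing=True)                  # degree-8 polynomial features

# Unpenalized least squares: nothing in the data alone prefers a simple curve.
w_free = np.linalg.lstsq(X, y, rcond=None)[0]

# Ridge penalty = Gaussian prior on the weights = a quantitative Occam's razor.
lam = 1e-2
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

print("unpenalized coefficient norm:", np.linalg.norm(w_free))
print("penalized coefficient norm  :", np.linalg.norm(w_ridge))
# The penalty biases the fit toward small, simple coefficients; without it,
# many nearly equivalent high-degree fits are indistinguishable on these data.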
The obvious question to ask about Ockham’s razor is: why? On what basis are we justified in thinking that, as a matter of general practice, the simplest hypothesis is the most likely one to be true? Setting aside the surprisingly difficult task of operationally defining “simpler” in the context of scientific hypotheses (it can be done, but only in certain domains, and it isn’t straightforward), there doesn’t seem to be any particular logical or metaphysical reason to believe that the universe is as simple as it could be.
Indeed, we know it’s not. The history of science is replete with examples of simpler (“more elegant”) hypotheses that had to yield to more clumsy and complicated ones. The Keplerian idea of elliptical planetary orbits is demonstrably more complicated than the Copernican one of circular orbits (because it takes more parameters to define an ellipse than a circle), and yet, planets do in fact run around the gravitational centre of the solar system in ellipses, not circles.
Because of quantum theory we now have two kinds of physical theories: those that work at a large scale and those that work at the scale of individual atoms. These theories are incompatible — the very successful theory of relativity isn't expressible in quantum terms, and vice versa.
Currently, we don't have a theory that works at all scales (a unified theory). However, an element of such a theory is an equation written by Paul Dirac in 1928. Dirac's equation successfully predicts the behaviour of particles moving at relativistic velocities, so to some degree it reconciles the relativistic and quantum views of reality.
While writing his equation Dirac realized it had two possible roots. At that point, Dirac could have decided his equation was only an approximation of reality (there are plenty of those), or he could claim his equation accurately described nature, therefore nature allowed two different kinds of matter, with positive and negative signs. Dirac decided his equation described nature and in so doing his equation implied the existence of a new form of matter, antimatter. Had Dirac applied the principle of Ockham's Razor he would have very likely dismissed the negative root of his equation and in doing so dismissed the existence of antimatter.
This principle goes back at least as far as Aristotle, who wrote "Nature operates in the shortest way possible." Aristotle went too far in believing that experiment and observation were unnecessary. The principle of simplicity works as a heuristic rule of thumb, but some people quote it as if it were an axiom of physics, which it is not. It can work well in philosophy, but less often so in cosmology, where things usually turn out to be more complicated than you ever expected. Perhaps a quote from Shakespeare would be more appropriate than Ockham's razor: "There are more things in heaven and earth, Horatio, than are dreamt of in your philosophy.".
Simplicity is subjective and the universe does not always have the same ideas about simplicity as we do. Ockham's Razor cannot be thought of as a substitute for insight, logic and the scientific method. It should never be relied upon to make or defend a conclusion. Only logical consistency and empirical evidence are absolute. Dirac was very successful with his method. He constructed the relativistic field equation for the electron and used it to predict the positron. But he was not suggesting that physics should be based on mathematical simplicity and beauty alone. He fully appreciated the need for experimental verification. In 1932, four years after Dirac formulated his equation, Carl Anderson discovered the positron.
I admit, Ockham's Razor is a sharp tool, but it is by no means universal and should be used with extreme care.
@Issam Sinjab: your example of Copernican vs. Keplerian orbits is simply excellent. There are two reasons for that:
- there is a qualitative difference between those two models, unlike the choice between various codebooks;
- Ockham's razor clearly fails in the case of planetary orbits, while the "codebook problem" has a unique quality factor (shorter code is better), thus removing the need for any other criteria, philosophical or otherwise.
The application of Ockham's razor involves two terms: simplicity and accuracy.
The best model is the simplest one that fits the observations, not simply the simplest one; otherwise the null model would always be the best!
The example of Copernicus versus Kepler is a good one for understanding what Ockham's razor is:
using information theory, you quantify the length of the description of the model and the length of the description of the errors made by the model. You sum these two terms, and you choose the model which minimizes this sum.
The Keplerian model has to be chosen: it is a little more complex, BUT it explains the observations better.
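In symbols, this is just the standard two-part minimum description length criterion (the notation is mine, not from this thread): choose the hypothesis H minimizing

\hat{H} = \arg\min_{H} \; \bigl[\, L(H) + L(D \mid H) \,\bigr]

where L(H) is the number of bits needed to describe the model and L(D | H) is the number of bits needed to describe the observations given the model, i.e. the errors it leaves.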
Issam,
Dirac was not in the situation where he had to choose between two possible equations: a complex one which explains the observations and a more complex one which explains the observations and also implies the existence of antimatter.
If he had been in that situation, Dirac would just have been a lucky gambler.
Dirac's choice was "my equation is correct or wrong." If it is correct, it implies the existence of antimatter. It is a deduction made with the mathematical tool. No problem with Ockham's razor.
Raphaël, I'm truly impressed. You are stubborn, but in effect you have presented us with something that may be called a "quantitative version of Ockham's razor". Maybe you are the first to make such a claim. It looks really new and highly valuable to me. To repeat the comment given by one of my referees many years ago: there is a germ of a good idea here! I'm tempted to continue this story: it took me a full six months of really hard work to improve that paper, but now it is my best cited article.
In summary: I think you are on the right track, but your idea has to mature a bit. The first obvious problem seems to be that you will almost certainly be using different codebooks for the two components of your sum. Both may be manipulated, thus influencing the numerical value of the output. Maybe something like the Cramér-Rao inequality (the one from statistics) or the Heisenberg uncertainty principle would be appropriate then? If you succeed, then machine learning will not be the only beneficiary. Good luck!
Marek,
Quantifying Ockham's razor is nothing new. It is referred to as the minimum description length principle, in which you describe the data and pick the description with the smallest sum of model and error. This approach is not only not new, it is also substantially limited. There are simpler and more general formulations available via Bayesian formulations of the problem.
Raphael,
You may be French, but William of Ockham was originally English and spelling the name of the English town he was probably born in as Ockham has little to do with French names. He went to Avignon as an adult ... exactly why is a question of debate.
From Avignon, he went to Bavaria, but that doesn't make him German or change the spelling of Ockham.
Raphael, if I may use a bit of levity here, maybe you ought to apply Ockham's razor to your use of the English language, and possible redundancies therein? In particular, your use of 'the' is most often quite wrong.
On Ockham in Bavaria:
In Munich (München-Schwabing) a street was named after him. The official name of this street (Straße in German) is Occamstraße (sic!). By the way, Ockham's razor was once dug up near the Occamstraße and has been on view in the Valentin-Musäum ever since.
Raphael
In light of the example of planetary orbits, you have defined Ockham's Razor as: "The best model is the simplest that fits the observations." We have a Keplerian model which is complicated but fits the observations, and a Copernican model that does not fit the observations but is considerably simpler. If we use your definition in the strictest sense of the word, there is no simple model that fits the observations; there is only a complicated model that fits the observations! It appears that you reached your conclusion only because you already knew that the Keplerian model is the true one, so that it naturally fits the observations, and not as a result of your definition of Ockham's Razor.
Considering Dirac's equation, there are two solutions (roots) of the same equation, and so both solutions have the same level of complexity (or simplicity). However, only one solution, the positive solution for matter, fits the observations known at the time. So using the definition "the best model is the simplest that fits the observations" would have dismissed the antimatter solution, since no antimatter was known at the time. Of course Dirac did not use Ockham's Razor; I am just illustrating what could happen if we blindly trusted the principle.
The problem with Ockham's Razor is that it is biased against complexity. Given a choice between a simple cosmos and a complex cosmos, Ockham's Razor will always opt for the simple, and justify this judgment on the basis of common sense. The weakness of Ockham's Razor is the failure to recognise that the cosmos is, in fact, hideously complex. The idea that the simplest explanation is invariably the best one sounds good in theory, but when it is actually applied to the real world, the results and conclusions may not be correct.
I have given two examples where Ockham's razor fails, and they cover the very large, the classical Newtonian physics (mechanics) of planetary orbits, and the very small, the quantum physics (mechanics) that describes particle behaviour. I am not dismissing Ockham's razor at all; I am only saying that Ockham's razor is not, and must not be used as, a substitute for insight, logic and the scientific method.
It is also common to state Ockham's razor as saying that entities should not be posited without evidence. There are many instances in machine learning where it is easy to show that positing such entities (known as hidden variables) allows learning to occur, whereas not using hidden variables prevents any practical learning from occurring. The reason is that hidden variables allow the mathematical formulation of the model to be massively simplified, thus allowing the weight of evidence to be applied in a more focussed manner.
Some good examples occur in hidden Markov models, in generative language models such as Latent Dirichlet Allocation, and in many other kinds of mixture modeling. Some non-parametric methods actually posit an infinite number of such hidden variables.
A recent case is found in the research on deep learning. Recent work in sum-product networks as described below critically depends on this.
http://research.microsoft.com/apps/video/dl.aspx?id=192562
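A small sketch of the hidden-variable point (the data here are synthetic and the numbers arbitrary): a two-component Gaussian mixture posits an unobserved cluster label for every point, and that extra "entity" is exactly what makes the model simple to state and easy to fit.

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Bimodal observations: two unlabeled sources mixed together.
X = np.concatenate([rng.normal(-3.0, 1.0, 500),
                    rng.normal(+3.0, 1.0, 500)]).reshape(-1, 1)

# The hidden variable is the unobserved component label of each point.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
print("recovered means  :", gmm.means_.ravel())
print("recovered weights:", gmm.weights_)
# Positing the latent labels and a second component turns an awkward
# density-estimation problem into two simple Gaussians plus a coin flip.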
Ted,
In the XIV century the notion of nation did not exist. The kingdom of England was led by the Plantagenets, who were also dukes of Anjou, Maine, Touraine, Normandy, Aquitaine...
The aristocrats, the knights and the clergymen spoke French. Most of them did not know how to write, and those who did wrote in Latin. Guillaume d'Occam was a theologian, and it is very unlikely that he wrote his name as William of Ockham. That is why I prefer Occam's razor rather than Ockham's razor.
Thomas M. Cover, who wrote a great book on information theory, "Elements of Information Theory", also uses Occam's razor, I presume for the same reason.
When you say "Bayesian formulations of the problem", which problem do you refer to?
Coding, machine learning, the scientific method?
Chris,
What a good idea! Applying Ockham's razor to reduce the number of mistakes made in writing English.
Marek,
Thanks for your comment.
My idea is that the scientific method needs Ockham's razor, and we do not know whether this principle is true. It is an assumption. It seems correct: until today, the laws of physics have described the universe well. But tomorrow, it is possible that there is some kind of Planck wall behind which the laws change. I think that this is possible, because Ockham's razor comes from us. Our brain needs it. Maybe the universe does not?
Issam,
The devil is in the details; that is probably a hidden meaning of the works of the theologian Ockham.
Applying Ockham's razor to physics is not simple, because we do not know how to evaluate the Kolmogorov complexity.
I suggest a very simple experiment:
1/ you write the Keplerian model to a file x1;
2/ you write to the same file x1 the errors made by the Keplerian model every day during one year.
You do the same thing for the Copernican model to obtain the file x2.
Then you zip each file and compare the sizes of the two files.
I bet a coin on zip(x1) < zip(x2).
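In case anyone wants to try it, here is a rough sketch of the experiment (the residual series below are invented placeholders; real daily ephemeris errors would replace them):

import zlib

def two_part_size(model_text, residuals):
    # File = description of the model + the errors it makes, day by day.
    payload = model_text.encode() + repr([round(r, 3) for r in residuals]).encode()
    return len(zlib.compress(payload))

# Placeholder residuals only: the elliptical model should leave tiny errors,
# the circular one larger systematic ones (use real observations in practice).
resid_kepler = [0.001 * ((d % 7) - 3) for d in range(365)]
resid_copernicus = [0.5 * ((d % 30) - 15) for d in range(365)]

x1 = two_part_size("ellipse: semi-major axis, eccentricity, period, phase", resid_kepler)
x2 = two_part_size("circle: radius, period, phase", resid_copernicus)
print("zip(x1), Keplerian :", x1, "bytes")
print("zip(x2), Copernican:", x2, "bytes")
# With real data, the bet above is that x1 < x2: a slightly longer model,
# but a much shorter description of the errors.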
Ted,
There are at least five frameworks for machine learning: Minimum Description Length, Bayesian, Structural Risk Minimization, Probably Approximately Correct, and reinforcement learning. Currently my preference is for PAC learning and reinforcement learning, but it is just a preference. You cannot say that the Bayesian point of view is better than another one. The Bayesian framework is efficient when the prior is a good one; otherwise it is the worst.
I never thought about HMMs in terms of the MDL framework. Maybe it is not the right framework to analyse them. Concerning deep learning and neural networks, the MDL framework is relevant. Deep learning works well because it uses sparse coding. When you stack hidden layers, the coding of the problem is in the last hidden layer, not in the weights.
Anyway, it is an interesting machine learning discussion, but maybe off topic.
One of the biggest challenges to Occam's Razor was the question of how best to apply it to quantum mechanics. Einstein reckoned that since we already had classical mechanics, and CM with quantisation could explain QM, the simplest explanation of QM was to say that CM continued operating below the quantisation threshold, even though quantisation foiled our efforts to directly verify that this was the case ("Hidden Variable Interpretation", HVI). There was no reason to invent a whole new "unnecessary" type of physics, and doing so violated Occam's Razor.
However, many of Einstein's colleagues saw the problem differently. By starting with quantum mechanics, they argued that since statistical mechanics could explain quantum physics without recourse to hidden variables, there was no need to invent an underlying layer of classical physics whose existence could not be directly verified /on principle/, and that it was instead Einstein who was introducing an unnecessary element to the model.
Both groups used Occam's Razor to argue that what the other group was doing was needlessly complicated.
If Occam's Razor is a presumption in favor of simplicity, the major problem with its application is vagueness in what counts as simplicity. Is having fewer fundamental forces in physics simpler than more fundamental forces? Probably so because it results in a system with fewer types of things. Is having fewer species of animals simpler than more species of animals? Probably not. So, unless we have a much better idea of what counts as simple, any use of Occam's Razor is likely to turn against us.
And indeed, I'm not sure why, in the original question, it's simpler if the future continues to resemble the past (on the continuance of the law of gravity). The idea that the future will continue to resemble the past seems to imply that there is something that makes the future be like the past, and at least in one sense, it would be simpler if no such type of thing existed.
God does not always shave with Occam's razor, even in computer science. See this paper:
https://www.researchgate.net/publication/2417662
Article God doesn't always shave with Occam's razor — Learning when ...
Interesting approach to pruning a decision tree, and an interesting paradigm: all gods are possible?
I argue that Occam's razor serves us up to a point, in that what we are trying to do in our physical description of nature is to provide a *compression* of information:
https://www.researchgate.net/publication/256838918
...however, not all things are compressible, and that is where it falls over. In physics we tend to focus on those phenomena that are compressible into neat models, so Occam's razor tends to be helpful. However, there is an unexplored range of possibilities where our traditional techniques may fail. In the areas of Big Data and Complex Systems we have to approach things a little differently.
Article The Reasonable Ineffectiveness of Mathematics [Point of View]
I read your paper, and I share the same point of view: Mathematics is a clever human invention for describing the world. In the same vein, I think that Occam's razor is simply the only way to make the scientific approach well posed. So it is a principle derived from mathematics, even if it was proposed for theological purposes.
It must be noted that mathematics is older than writing. Ten thousand years ago, people built a lunar calendar in Scotland. This implies knowledge and formalization transmitted through the ages, without writing, by a kind of scientific community.
(see http://intarch.ac.uk/journal/issue34/gaffney_index.html)
And for what purpose? Clearly not for farming in Scotland in the 8th millennium BC. Probably for the same reasons we do it: just to describe and understand (with some killer applications, such as event prediction, needed to get funded).
In the area of machine learning, mathematics remains the cornerstone, even if part of the community has an experimental approach which is useful for applications. Indeed, so far mathematics has failed to model complex systems such as the brain, but we are just beginning to work on it, and we have a lot of preliminary results.