This is a question just to stimulate a discussion on the place of complexity science in the landscape of the solid, well-established sciences: a super-science of all the sciences, or just a methodology?
For me, the epistemological status of complex systems is a nightmare. First and above all, too many different things are being called complex systems; they are "too many" inasmuch as they are not compatible with one another. The oldest is complex systems as a transcendent form of interdiscipline (not merely additive, which we would call a multidisciplinary approach), but one that requires a dialogue between the disciplines, in fact a dialectical interdiscipline. This form is probably the oldest; it was the practice of Rolando Garcia (see Drought and Man by Garcia and others), with its epistemological elaboration in Sistemas complejos (Editorial Gedisa). Garcia co-authored Psychogenesis and the History of Science with Piaget, the foundation of genetic epistemology, which is the epistemological frame for his Sistemas complejos (no English translation as far as I know).
Next we have the most common use of "complex systems," grounded in the development of simulations of interacting agents. There is no real epistemology here; it continually fails on Popper's terms (it lacks falsifiability).
Third, there are some vague attempts to throw into the pot everything that is somehow odd, be it chaos, fractals, self-organization, and so on. This is just another paradigm and paradigmatic group.
In short, the divide between "normal science" and "revolutionary science" (in Kuhn's terms), or "normal science" and "heretic science" (in Bourdieu's), runs across the field in a very visible form.
If I take Garcia's meaning and generalize a little, complex systems is just science in the old way, the science practiced by the founders. If instead I take the definition in Wikipedia, it is just a method for normal science.
If you look at the etymology of complexity, you find that it comes from the Latin "complexus," meaning "that which is woven together." In that sense it is a science stricto sensu according to Popper. For instance, if I say that the properties of NaCl are different from those of Na and Cl separately, this can be perfectly well tested, and one can try to refute it with evidence to the contrary. On the other hand, it runs counter to the reductionist view: a detailed knowledge of the units no longer helps us understand the complex. I feel that it still lacks a good epistemological understanding, and it seems to me that this understanding is itself complex (not to be confused with complicated).
Firstly, thanks for your comments. They help me think a little more about complex systems.
Perhaps because at the moment I am working with agent simulations, I feel more comfortable trying to see complex systems in terms of relationships, whether among agents or among sciences.
Personally, I don't much like the expression "super science," because it sounds like a "big science" more important than the others. Well, science is science, no matter which one we are talking about, and again we have relationships: relationships among different areas of science.
I am sure that I need to study much more, but at this point I see complex systems as a different approach to the same things that were already there before: instead of looking only at a particular thing, we pay attention to its connections, trying to understand the whole.
Reductionism was and is important. Almost everything that we have in the sciences was built using a reductionist approach, and it is useful for understanding the parts, but complex systems says we need to go a little further and not stop at the parts.
Thinking like this, perhaps complex systems looks like a methodology, but as I said, I need to study more.
The question was about the epistemological status of the complexity sciences relative to other sciences. Typically, a complex system is a dynamical system consisting of two parts: a rule or 'dynamic,' which specifies how the system evolves, and an initial condition or 'state' from which the system starts. Some dynamical systems evolve in exceedingly complex ways, being irregular and initially appearing to defy any rule; the next state of the system cannot be predicted from the previous one. Henri Poincaré discovered that the reason for this did not lie in the rules for how the system evolves, but rather in specifying the initial conditions from which the rules start in their application. More exactly, such a complex system (more than complicated) is characterized by "sensitive dependence on initial conditions." The situation is even more difficult to analyze given that each node of such a system has its own characteristic temporality that is unequally and heterochronically linked to other nodes at various levels within the system.
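To make "sensitive dependence on initial conditions" concrete, here is a minimal sketch (my own illustration in Python, not taken from any of the works mentioned in this thread) using the logistic map in its chaotic regime:

# Two trajectories of the logistic map x_{n+1} = r*x*(1-x) at r = 4,
# started from initial conditions that differ by one part in a million.
def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000)   # one initial condition
b = logistic_trajectory(0.200001)   # a nearly identical one
for n in (0, 10, 20, 30, 40):
    print(f"n = {n:2d}   |x_a - x_b| = {abs(a[n] - b[n]):.6f}")

The rule is completely deterministic, yet the tiny initial difference grows until the two trajectories are effectively unrelated, which is the practical sense in which the next states become unpredictable.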
It turns out that in mathematical theory the change from order and predictability into unpredictability or chaos for dynamical systems is governed by a single law, and that the 'route' between the two conditions is a universal one. According to Peitgen and his colleagues: "Route means that there are abrupt qualitative changes--called bifurcations--which mark the transition from order into chaos like a schedule, and 'universal' means that these bifurcations can be found in many natural systems both qualitatively and quantitatively." This suggests that, like other sciences, complex systems are predictable. That is a hasty conclusion. Put another way, chaos is a type of non-linear behavior emerging along a universal route. At a certain point along this route organizations become highly sensitive to initial conditions and may abruptly change. A series of ever-increasing, self-reinforcing "errors" is made. These continuously repeated errors become amplified and redefine the functions of the organization, which in turn redefines its structure. The errors increase the organization's sensitivity to small changes in the environment (sensitivity to initial conditions), which in turn cause large changes in the organization's structure. Thus "...process and structure become complementary aspects of the same over-all order of process, or evolution. As interacting processes define temporary structures...so structures define new processes, which in turn give rise to new temporary structures." The morphogenic rule, driven by the heterochrony of its nodes, leads to the "emergence" of unpredictable structures. Traditional normal science cannot deal with such complex systems, characterizing them as "madness." As far as I can tell, autonomous agent simulations are the best way to investigate such systems, but they raise serious questions about what counts as scientific proof.
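For the period-doubling route itself, here is a rough sketch along the same lines (again my own toy example, with ad hoc transient lengths and rounding) that counts how many distinct values the logistic map settles onto as the parameter r increases:

# For each r, discard a long transient and count how many distinct values
# the trajectory settles onto (its period); the rounding threshold is ad hoc.
def attractor_period(r, x0=0.5, transient=5000, sample=256, digits=5):
    x = x0
    for _ in range(transient):
        x = r * x * (1.0 - x)
    seen = set()
    for _ in range(sample):
        x = r * x * (1.0 - x)
        seen.add(round(x, digits))
    return len(seen)

for r in (2.9, 3.2, 3.5, 3.55, 3.8):
    p = attractor_period(r)
    print(f"r = {r:<5} -> " + (f"period {p}" if p <= 16 else "no short period (chaotic)"))

The output runs period 1, 2, 4, 8, ... with the bifurcations crowding together until chaos sets in near r = 3.57 or so, which is the universal Feigenbaum scenario Peitgen and colleagues describe.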
But wait, this is not the end of the story. Mathematically, Gödel explored long ago the implications and complications related more generally to the aspirations of closed, predictable systems. The philosophical work of De Landa, Deleuze, and Guattari attempts to provide a philosophical foundation for complex systems like those above, particularly in relation to emergence. And in physics, the foundational conundrums exposed by quantum theory have thoroughly undermined the nineteenth-century physics that we still drag forward. Thus, to me, at least superficially, the complexity sciences and "science" are in a similar place.
I have been working on statistical complexity measures. In that field the notion of complexity is not unique: there are many definitions of what this measure is, and they give different results. I very much like the review given earlier by Gus Koehler.
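To illustrate how the different definitions can disagree, here is a small sketch (my own example, not from Koehler's review) comparing Shannon entropy with the LMC statistical complexity on three simple probability distributions:

from math import log

def shannon_entropy(p):
    return -sum(pi * log(pi, 2) for pi in p if pi > 0)

def lmc_complexity(p):
    # Lopez-Ruiz / Mancini / Calbet: normalized entropy times "disequilibrium"
    n = len(p)
    h_norm = shannon_entropy(p) / log(n, 2)
    d = sum((pi - 1.0 / n) ** 2 for pi in p)
    return h_norm * d

distributions = {
    "uniform": [0.25, 0.25, 0.25, 0.25],   # maximally disordered
    "peaked":  [0.97, 0.01, 0.01, 0.01],   # almost perfectly ordered
    "mixed":   [0.55, 0.25, 0.15, 0.05],   # in between
}
for name, p in distributions.items():
    print(f"{name:8s}  H = {shannon_entropy(p):.3f}   C_LMC = {lmc_complexity(p):.4f}")

Entropy ranks the uniform distribution as the "most complex," while the LMC measure assigns low complexity to both the fully ordered and the fully disordered cases and singles out the intermediate one: same data, different notions of complexity, different answers.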
Nicely stated, in a way that opens things up. You seem to be pointing at the problem of emergence, which suggests that something is emerging out of nothing. It is emerging from within a complex system's boundaries, but not because the description of the production system left a set undefined. The view is often taken that, as you suggest, we just haven't gotten to a hidden variable due to an incomplete description, and that it will be found later, accounting for the cause of the emergent.
Perhaps another take on this is that, because the production system cannot be defined in such a way as to include everything in the universe, the production-system description must be incomplete and subject to unpredictable perturbations, leading to an apparently unaccounted-for emergence. True, but not very helpful from a scientific viewpoint seeking a more or less single-cause prediction.
Complex systems theory suggests that there is some mechanism of causality that is itself emergent, due to some sort of continuous error-making function, almost Darwinian. I also think it may have something to do with science having an incomplete theory of time. This is a problem for physics and quantum theory, which, at the most fundamental level, lacks time. In fact, time doesn't seem to emerge until we have a biological organism.
Complex systems are self-referential too, as in the way the Mandelbrot set is built out, or "grows".
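As a concrete reminder of that self-reference, here is a tiny escape-time sketch (my own, deliberately coarse) in which the set "grows" out of nothing but the repeated application of z -> z*z + c:

# Mark the points c whose orbit of 0 under z -> z*z + c stays bounded.
def escapes(c, max_iter=60):
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:        # once |z| > 2 the orbit is guaranteed to diverge
            return True
    return False

for im in [y / 10.0 for y in range(10, -11, -1)]:
    print("".join(" " if escapes(complex(re, im)) else "#"
                  for re in [x / 20.0 for x in range(-40, 11)]))

Each '#' is a point whose own iterates stay bounded; the familiar cardioid-and-bulb shape appears from the rule referring back to its own output.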
Different kinds of noise (pink, brown, etc.) also affect the timing and unfolding of bifurcations. I suspect this would also be a problem for a complete description of the production system since some noise is by definition fully random. And, as you suggest, there are more forms of unpredictability that stem from many other causes lying outside of the system.
I guess my view is that causality in complex systems suggests that the traditional notion of an overriding, single, large, determinative cause may not be findable close to a chaotic bifurcation point, where a very small "sensitive condition" may produce a complete change in the system. Thus there appears to be an inability to fully explain causality in both quantum physics and complex systems theory, if for different reasons.
We Are Not in Kansas Anymore: Beyond Classical Science
I agree with Suteanu (2005, "Complexity, Science and the Public," Theory, Culture & Society, 22(5), 113-140), who argued that the complexity sciences demand a reshaping of the three pillars of classical science: measurability, reproducibility, and predictability. A new scientific worldview is emerging that keeps these classical concepts in its toolbox, yet extends their meaning with new twists that transcend our previous understanding of them. The new scientific worldview deeply honors the usefulness of the three pillars, but in taking the complexity turn, a new spin has been put on them.
Measurability
The concept of measurability has been changed as a consequence of fractal theory. It was Mandelbrot who demonstrated that there is no single best scale on which to get the most meaningful measures. The best "scale," in fact, is what one achieves by taking into account the results at all scales and finding the relation connecting them, their pattern. This use of multiple scales is not measurement in the classical sense of measuring units, and the issue of measurability can no longer be seen as strictly a matter of the limits of technology.
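A toy version of this point (my own sketch, not Suteanu's or Mandelbrot's example) is box counting on the middle-thirds Cantor set: the count at any one scale is arbitrary, while the relation between scales is stable.

from math import log
from itertools import product

LEVEL = 10
# Left endpoints of the level-10 Cantor construction, kept as integers n with
# the point being n / 3**LEVEL (base-3 digits restricted to 0 and 2), so that
# the box assignment below is exact.
points = [sum(d * 3**i for i, d in enumerate(digits))
          for digits in product((0, 2), repeat=LEVEL)]

for k in range(1, 7):
    # Boxes of size 3**-k: the point n / 3**LEVEL lies in box n // 3**(LEVEL - k).
    n_boxes = len({n // 3**(LEVEL - k) for n in points})
    print(f"eps = 3^-{k}   N(eps) = {n_boxes:3d}   "
          f"log N / log(1/eps) = {log(n_boxes) / log(3**k):.3f}")

N(eps) by itself depends entirely on the scale chosen (2, 4, 8, ...), while the ratio log N / log(1/eps) is the same at every scale, about 0.631 = log 2 / log 3; the meaningful "measure" is the pattern connecting the scales, not any single measurement.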
Reproducibility
Complexity has altered the link between reproducibility and relevance. In the classical science framework, for an event to be meaningful, it must be reproducible. But complexity and sensitive dependence on initial conditions point out that quite important events are unique and rare. When we take multiple scales and multiple perspectives into account, and acknowledge that a system is comprised of numerous interactions in circumstances in which details keep changing, reproducibility in the classic sense is untenable.
Predictability
A core tenet of complexity is that in a system far from equilibrium, factors which had previously appeared negligible can suddenly become highly salient. Suteanu regards this understanding alone, if taken seriously, as sufficient to change the face of scientific studies. And because sensitivity to initial conditions can be of such importance, because conditions are continually reset, and because we can never know initial and subsequent conditions with infinite precision, we cannot know for sure how a system will evolve. We often find that our predictive accuracy decreases exponentially with time.
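To put a rough number on "decreases exponentially with time," here is a back-of-the-envelope sketch (my own, with purely illustrative figures): estimate the largest Lyapunov exponent of the chaotic logistic map and convert an assumed measurement error into a prediction horizon.

from math import log

def lyapunov_logistic(r=4.0, x0=0.3, n=100000):
    # Average of log |f'(x)| along a trajectory of f(x) = r*x*(1-x).
    x, total = x0, 0.0
    for _ in range(n):
        d = abs(r * (1.0 - 2.0 * x))
        if d > 0.0:              # guard against the measure-zero case x == 0.5
            total += log(d)
        x = r * x * (1.0 - x)
    return total / n

lam = lyapunov_logistic()
delta0, tolerance = 1e-9, 1e-2   # assumed initial error and failure threshold
horizon = log(tolerance / delta0) / lam
print(f"Lyapunov exponent ~ {lam:.3f} (theory: ln 2 ~ 0.693 at r = 4)")
print(f"Forecast fails after ~{horizon:.0f} steps with a 1e-9 initial error")
print(f"Measuring a million times better (1e-15) buys only "
      f"~{log(tolerance / 1e-15) / lam - horizon:.0f} extra steps")

Errors grow roughly as delta0 * e^(lambda * t), so improving the precision of the initial measurement by many orders of magnitude extends the usable forecast by only a handful of steps; this is the sense in which classical predictability is transcended rather than merely degraded.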
Classical sensibilities concerning predictability have been transcended by the growing understanding that the choice of what constitutes a system is a matter of perspective and scale. Choosing a focal system entails choices about what constitutes the beginning of the system and therefore what came before, what factors are to be considered inside or outside the system, and which of the inside factors are considered relevant. Such choices are not ontologically 'given,' nor are the mechanisms for making them. Luhmann pointed to the reduction of complexity as a central operation of social systems: environments are always overwhelmingly complex, so systems selectively reconstitute this complexity for their own purposes. This is also a central concept of autopoiesis. Living systems regenerate and maintain themselves in the face of the vast wilds of complexity in the external environment by selectively determining what of this complexity they will relate to. Most systems are not conscious, and even when they are, the choices made to reduce complexity are most often not conscious. We must also note that even among such conscious systems as scientists and researchers, these choices unfortunately do not often enough rise above the level of assumption. The emphasis on dyads in social research, for instance, is a pervasive practice that reinforces the drastic reduction of complexity against which the complexity sciences serve warning. The role of Richard Dawkins's 'selfish genes' and any related beliefs in the primacy of the gene in biological development are just as reductionist and deterministic. So too are the prevalent economic theories. How perfectly reliable can predictability be when we exist in vast complexity, yet choices are continually being made to reduce that complexity to manageable levels?
When I was taking Calculus I, eons ago, I was made aware that limits do not always commute (a standard example is given below). Much later, in physics, I learned that mathematicians use "as small as needed" while physicists use "as small as measurable"; while things went along well permuting limits and treating "as small as measurable" as if it were "as small as needed," it was just a matter of time before those things revealed that they are not precisely the same. Thus, the unpredictability of chaotic systems is more a sales pitch of the chaos community than a problem. What it simply says is that when we look at problems not through their own rules (the problem in itself) but from our prejudice, we are bound to run into trouble. To understand chaos is to understand the set of rules that allows us to see as logical, and even obvious, what appeared unpredictable and difficult when viewed through the rules learned for other problems. Chaos changes our prejudices rather than changing science; it shakes some of our basic beliefs and we are forced to reconsider things, but this, precisely this, is what science is about! It may change how "normal science" is practiced, but it does not alter "heretic science," which always deals with the "problem in itself."
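A standard textbook example of limits failing to commute: for f(x, y) = x^2 / (x^2 + y^2), letting y -> 0 first and then x -> 0 gives 1, whereas letting x -> 0 first and then y -> 0 gives 0. The order of the limiting procedures matters, which is the whole point about "as small as needed" versus "as small as measurable."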
None of this alters reproducibility, unless you try to reproduce what is not reproducible by its own nature. Science is about understanding regularities, and finding regularities is always a matter of projecting out some attributes; hence it is reductionist. There is no way to achieve understanding, conceptualization, or abstraction (your choice) without a reductionist projection; reduction is the dual of abstraction. However, we must be deeply aware that the reduction that is good for finding some answers may not be as good for other questions, hence any theoretical understanding must state clearly the conditions of application (for all x in X) as much as the conclusions.
All too often, models in complex systems are presented "as is": they work when they do (or, what is the same, we do not detect their mistakes), and when they do not work we learn nothing, or almost nothing. Science is not about prediction and is not about falsification either, yet both are needed, for science is the dialectical process between the two, or, as Garcia would have it, there is a dialectic between the integration phase and the differentiation phase of research.
Interesting points of view. Many thanks to all the participants of this discussion! To be just a little provocative: we have a nice mathematical theory of differential equations, extended by René Thom into so-called catastrophe theory. This can be applied to many systems abstracted from different domains of the real world. So it is something similar to the complexity approach, but not called a science at all. So why this differentiation? Well, complexity gives us some bonuses like emergence, self-organization, etc. But is that enough to promote it to a science per se?
I believe that complex systems can turn the page titled "reductionism."
It concerns the use of linear logic to explain complex processes.
A subjective component begins to appear more and more. And this is objective.
1.
You can take a subjective set of objective facts and get an "objective" result.
You can take a different subjective set of objective facts and get another "objective" result.
We obtain a logically non-contradictory chain of one of the many "virtual worlds" that are far from reality. We "know" the end result beneath the facts and choose that result unconsciously.
The result: no full picture.
This can be seen particularly clearly in historical scholarship.
An illustration: the Russian joke about the dumb "blonde."
One blonde says to another: "Now I know where the light from the kitchen disappears to when I press this button." She walks to the refrigerator and opens it: "Yes, here it is, the light is inside!"
2.
Another feature of complex systems is the development of processes along a catastrophic scenario.
Various small fluctuations that disturb the balance are "damped out" by the system in the process of self-organization.
However, a specific spatio-temporal sequence of them leads to disaster. And it is not a classic bifurcation.
An illustrative example of such a process is an air disaster: a set of small events in a certain sequence becomes fatal.
Yet another problem: some of what we take to be noise. Often this is information that fits the scenario described in point 2. An example is a hologram: from the classical point of view it is a meaningless set of "bands"; the image appears only when certain "recovery" conditions are met (as they say, the devil hides in the details).
The conclusion
We are on the threshold of a transition similar to the transition from photography to holography, from the "plane" to "space." This requires a unifying approach, and it cannot be obtained by simple arithmetic addition.
I would say that reductionism stops being effective when we observe processes in open systems, where a small uncontrolled influence outside our "field of view" can lead to "irrational" changes. The process then becomes probabilistic, but that does not mean that God plays dice. We see it that way because we are inside the system.
I am concerned about the planar, one-dimensional character of logical modeling as a method.
I feel closer to modeling based on the connection of everything with everything; in other words, a model of the Universe as a hologram with a multidimensional kind of logic, beyond which there is something that lies beyond any of our logic.
I couldn't agree with you more. A very beautiful overview.
The issue of pattern recognition is key to much of what we are talking about. At the observational level, humans as animals are particularly good at pattern recognition; no computer-driven pattern recognition system, at least today, can do a better job. Even so, the patterns being recognized are human patterns, "all too human" so to say. This has been beautifully put forward in Thomas Nagel's essay "What Is It Like to Be a Bat?" in his book Mortal Questions. He makes a number of interesting points about what it means to be an organism with attributes different from those of humans: "...the fact that an organism has conscious experience at all means, basically, that there is something it is like to be that organism. There may be implications about the form of the experience; there may even ...be implications about the behavior of the organism. But fundamentally, an organism has conscious mental states if and only if there is something that it is like to be that organism--something it is like for the organism." The implications are: "Without some idea, therefore, of what the subjective character of experience is [here directly related to how it is to be that organism, with all of its sense inputs and mind processing that produce a consciousness], we cannot know what is required of physicalist theory" (167). This is no small problem, since we are not interested in just "any" pattern or theory but rather THE pattern that seems to fit what is out there that we are consubjectively trying to explain. This observation also applies to scientific instruments that use computer programming, electronic circuits, and various kinds of display systems (the virtual reality of a computer screen, for example), in relation to the icons used to present the data (attractive avatars or other symbols), relative to how the viewer can see them (the eye's ability to pick up certain scanning rates that allow us to see certain objects or behaviors), and to how what is seen is related, unconsciously, to how attractive the image is. This, of course, says nothing about whether the viewer understands the algorithm or visualization programming that is being "trusted" to produce the pattern. The point of all of this is how limited our ability as scientists is to generate "the" pattern, which may extend in very important ways beyond our own ability to know and to interpret, given certain subjective biases built into our instruments.
Page, in his discussion, and many of the other commentators describe what is unique about the way the complexity sciences approach causality, patterns, etc., clearly laying out the value of complexity methods and theory. Combining this with your points about patterns, as qualified by my suggestions, leads to an even higher level of complexity: that between the subject viewing and creating the pattern and its relationship to the object. My own research has gone in the direction of seeing all of these complex systems dynamically, to put them in motion so to speak; that is, how each level has a different temporality (oscillator-like for certain systems, or temporal phenomenology as Being) that may account for even more interesting variations. D'Arcy Thompson was one of the first to look at this. Heterochrony relative to the patterning of form is another. Catastrophe theory and chaos theory, as well as autonomous agent modeling (if temporalized), are ways of exploring such patterning. It sounds like your work is an important doorway into such analysis as well.
You can see what I've been up to at my company's web site: www.TimeStructures.com under "Time and Complexity." See what you think. We can go off line to: [email protected]
Interesting discussion. Complexity science is more theory than science. Complexity theory has the aspects of emergence and self-organization that position complexity theory (science) as an epistemology that can be applied across unlimited research areas. In Bohm & Peat (1987) complexity science is portrayed as an all-encompassing theoretical area. In Kauffman (1995) complexity theory (science) is used to explain a number of far-reaching events. As an epistemology, complexity theory (science) is like the metaphor of not being able to see the forest for the trees: complexity theory (science) is the forest, and it explains why the trees exist and how they exist.
Bohm, D., & Peat, D. F. (1987). Science, order and creativity. London: Bantam.
Kauffman, S. A. (1995). At home in the universe: The search for the laws of self-organization and complexity. Oxford: Oxford University Press.
I believe it is not either/or, holism vs. reductionism. We need both. We get nowhere trying to find meaning only in the whole or only in the parts. We find meaning at different levels, in different domains, and we find truths, but we never find THE TRUTH; that is a fool's game. We can build the world up and we can pull it apart and find meaning both ways. Neither should have a monopoly on what is true or knowable.
After going back and reading your various entries above, I apologize for repeating much of what you've already articulated very well. One of my favorite authors/books on a related subject is Robert Laughlin's "A Different Universe: Reinventing Physics from the Bottom Down." Laughlin argues that the modern reductionist worldview has finished its work, and that it is time for the emergent worldview to take the baton of science. I love Laughlin, but I disagree that emergence must supplant reductionism. It isn't either/or; it is each together and sometimes apart, dancing a delicate dance that will never end. "The Emergence of Everything" by Harold Morowitz is also a fabulous intro to the emergent worldview, and "The Tao of Physics" by Fritjof Capra is excellent as well. But my favorite synthesis is by Lawrence Cahoone in his paper "Towards A New Metaphysics of Natural Complexes." Google it.
This paper blew my mind when I read it last year. Cahoone is coming out with a book on this subject later this year (I believe). I was introduced to Cahoone through a course on the intellectual history of philosophy that he taught for the Teaching Company, which I absolutely loved. In his "New Metaphysics," Cahoone allows us to understand non-physical systems as both REAL and natural. That is what we need. Invisible but real systems include consciousness, society, and the so-called "economy."
@Kees, what does it mean, according to you, "to explain consciousness"? I mean in the context of your "viewpoint." Is it enough to put it into different perspectives? My feeling is that there is no contradiction between "bottom-up" and "top-down" explanations, but rather a kind of complementarity. Of course, knowledge of the rules of a game (chess, for example) is not sufficient to know the winning strategies and all the other subtleties. In science one accepts more and more the explanations given by complex simulation programs, which nevertheless need as input very detailed information concerning the "elementary rules of the game" (for example, QCD calculations on the lattice). We then sometimes find new properties which could be called emergent ones. Even in the case of pure mathematics we observe a shift in the paradigm that a proof of a conjecture can only be analytical, now allowing for computational methods. This is certainly an interesting evolution of the epistemological status of the complexity perspective (?). Thanks for your interesting contributions.
Kees and Adam, I'm really enjoying your discussion. A supportive amplification of Kees's position from a slightly different direction regarding consciousness might be interesting. Even within the literature on consciousness, there are two basic, unresolved, incompatible, and mutually exclusive universal approaches commonly taken by consciousness researchers. One side takes the position that consciousness is akin to a universal force, much like gravity, with biological processes serving as a "receiver" that manifests it for the benefit of a particular organism. Here biological processes are analogous to a television receiver that transforms electromagnetic waves into particular representations, or in our case "conscious" behaviors. To have a body is to have the means to have an embodied consciousness, with variations in consciousness being tied to variations in embodiment (animal vs. human) or to the completeness and arrangement of the parts (genius vs. developmentally delayed). Of course, there is the question of how the morphogenetic rules that form this "receiver" came into being. This view is more inclusive in that it includes mystical, transpersonal, near-death, and other forms of consciousness in addition to the rational ones. Vimal describes and lists 40 distinguishable meanings of "consciousness" and considers the list to be incomplete.
Definitions of what consciousness means can be broadly divided into those that mainly refer to 'function' and those that are primarily about 'experience.' A typical list of the functions of consciousness includes: realism (dreams and hallucinations are excluded even though they seem to be phenomena associated with consciousness), representation (poetry, art, and dance don't seem to play a role as forms of conscious knowing), intentionality (spontaneous, automatic writing not included), information, control, memory, and self (feelings, the construction of values, suffering?). As almost all concepts of consciousness are theory-laden without much experimental support, Vimal regards the prospects for agreement leading to a significant reduction in their number as dim, but urges authors always to specify which meaning(s) of consciousness they have in mind.
William James' common-sense approach to what he calls the stream of consciousness is helpful. James' points can be summarized as follows: (a) conscious states of mind succeed each other; (b) every conscious state is personal; (c) personal states of consciousness are always changing; (d) some parts of consciousness' objects are attended to and others not; and (e) consciousness is personally, sensibly continuous. Clearly, consciousness is fully embedded in a particular personal present. Consciousness also seems to involve ways of knowing that are personally embodied in muscles and other processes but do not result in a "known" in awareness. Consciousness involves a heterochronic relationship to, for example, a personal place and its milieu, involving circumscribed relations of proximity, of envelopment, and the wild principle of logos (Merleau-Ponty). Consciousness neurologically integrates a number of senses into a personal qualitative experience in a "present," as a chunk in a personal "flow."
The incredible complexity of consciousness and its manifestations suggests that a reductionist approach can trace a single element or two but not touch the whole. Ironically, the epistemological stance of the complexity sciences, while much broader, and, as I noted, afflicted by some of the same problems as the reductionist sciences, still excludes methods of direct investigation that lie outside the scientific method, which is itself the result of political struggles over the scope of knowing (here I'm talking about Descartes). Such methods include art, meditation, visualization, etc., which have their own issues of validity in their process of "self-reflection." Even so, a quick review of the complexity literature, at least from my perusal of it, shows that it agrees with Kees in its epistemological range and capacity, including cross-checking and validation from multiple methodological perspectives. By the way, this is my approach to public policy as well (not much meditation and art, though!).
Among many other points, I think complexity science can embrace the arts and humanities. This follows what Christopher Alexander said: that the order in Nature and in what we make and build is essentially the same (see the following illustration). In a recent paper, I have developed a mathematical model of wholeness that can be used to quantify the degree of beauty in things or in art; for example, the right snowflake is more beautiful than the left one.
Alexander C. (2002-2005), The Nature of Order: An essay on the art of building and the nature of the universe, Center for Environmental Structure: Berkeley, CA.
Jiang B. (2015, accepted), Wholeness as a hierarchical graph to capture the nature of space, International Journal of Geographical Information Science, xx(x), xx-xx, Preprint: http://arxiv.org/abs/1502.03554
Too many administrators, researchers, and others ignore, consciously or unconsciously, the complexity of issues because they want to get things done or move on to something else. We impose or look for a structure in order to simplify and/or streamline processes and interactions for our convenience, and then we compliment ourselves or are complimented by others.