Thanks to recent research on how population levels, demographics, and the environment affect cultural evolution, the debate over what gave rise to "modern human behaviour" has made substantial progress. Continuing archaeological investigations at Middle Stone Age sites in South Africa, such as Blombos, Diepkloof, Sibudu, and Pinnacle Point, seem to confirm the importance of such criteria by pushing the date when behavioural flexibility emerged closer to when anatomically modern humans first appeared. As a result, the relevance of neuro-cognitive factors as a means of determining the behavioural profile of anatomically modern humans has been challenged. However, as culture mainly concerns the manipulation and exchange of information according to context, and as the brain is primarily an information-processing organ, it is perhaps premature to discount the role of neuro-cognition in this debate. Neuro-cognition may still be relevant in providing the preconditions for culture and behavioural flexibility. Thus, by assimilating neurocognitive factors with population levels, demographics, and the environment, are we at last on the brink of resolving Renfrew's "sapient paradox"?
You're entirely right to say that the insights coming out of neurocognitive study are greatly under-applied in analyses of human evolution. I was looking into this recently and found myself a little shocked at the lack of studies addressing neurocognition in early Homo sapiens. It is clearly integral to defining what it is to be a "modern human", as well as to tracking the transition toward that definition. To answer your question: we're some small steps closer to a resolution of the "sapient paradox", I'd say, but the fact is that the marriage of neurocognitive knowledge (the field itself still being fuzzy on many fronts, as well as fast-moving) with theories about human evolution and behavioral change is both young and neglected. I've only seen it really start to pick up within the last decade or so. I think that we can indeed come much closer to "resolution" if neurocognition is given a more prominent place in archaeological analyses of early modern humans. As it stands now, however, I don't expect resolution any time soon.
Thanks for your response, Christina. The optimism that neuroscience would, over the past decade, provide important insights into what makes us human has now been overshadowed by the realisation that the crucial factors probably concern social dynamics related to the number of individuals in a community, with culture providing the "glue" that helps keep overly large groups together. Evidence from Pinnacle Point, South Africa (see Curtis Marean and Ian Watts' papers) suggests, along with other behavioural markers, the use of ochre for ritual purposes as far back as 165,000 years ago, which comes close to the first appearance of anatomically modern humans. Previously, it was surmised that a single neurocognitive event led to the crucial difference at a much later period, producing the mismatch between the first appearance of anatomically modern humans and evidence of cultural artefacts, i.e., Renfrew's "sapient paradox". However, the evidence now seems to suggest the neurocognitive substrates were probably already in place around 180,000 years ago, if not before, but that expanded populations were required before they could be fully exploited for realising cultural identity. It is the nature of these substrates, which provided the preconditions for realising cultural identity, that is now at issue. I would suggest mirror neurons and theory of mind are particularly important, together with a raft of other cognitive factors including working memory.
Indeed, working memory seems to be the main focus of the most recent neurocognitive theory I've seen that has been applied (however sparsely) to these questions.
I think the real problem with attempts to explain the sapient paradox is that there is so little to distinguish behavior in the Middle Pleistocene from that at 100,000 or 50,000 years ago. The use of objects in sites as evidence of human behavior is confounded by comparative ethology, as when we find bower birds collecting objects, arranging them, and building structures to "display" them. The teaching of tool making to young crows, and the building of structures by other animals, add to this, as humans did not accomplish this on any permanent scale until the last 10,000 years. What demands explanation is why we had such large brains for a million years without any advance in technology that could repay the energy cost of such expensive tissue. Comparison with dolphin brains and their sensuous, caring behavior is a dissatisfying answer to most people, though one could say that such large brains have mainly been aimed at non-material ends, dances, songs, the arts in general, to the satisfaction of group integration. The problem with this is that we then cannot explain the explosion after 10,000 B.C.E. Wolpoff and his students have been trying to identify such a continuity recently. I make a case for a similar economic context in my new book, The Anthropology of Complex Economic Systems, published by Lexington Books last fall.
I appreciate your observations, Niccolo. However, it seems that what differentiates Homo sapiens from the other species you cite, and this applies to our closest non-human primate relatives, is the ability to pool ideas/resources in ways that allow relatively complex tasks to be achieved, e.g., hafted tools, the use of adhesives, etc. The absence of evidence for this in the archaeological record during the Middle Stone Age/Middle Palaeolithic may be due to taphonomic effects (whereby the archaeological signal becomes fainter over time). In fact, accumulating archaeological evidence from the Middle Stone Age in South Africa now suggests the existence of quite complex behaviour, including the hafting and adhesives needed to exploit inter-tidal marine resources. It appears that one of the "costs" of the cognition that facilitated this flexibility relates to the ability to reflect (even over-reflect), through mental time travel, on the fruits of one's labour, e.g., evidence for the use of particular hues of red ochre at Pinnacle Point that suggests ritual concerns. The activities of early hominins such as Homo heidelbergensis may have been constrained by a more "passive" cognitive style and/or mainly restricted to behaviour in the social domain, which, as Robin Dunbar has pointed out, requires a large brain. The crucial difference may, however, reside not in brain size alone but in neural interconnectivity, for example, in the pyramidal neurons of the granular prefrontal cortex (Elston et al 2006).
I think, as a human, you are overlooking the capacity of animals. I have written a chapter on how we often see ourselves as unique until we get the comparative bug, just as it was "man the toolmaker" until Goodall proved otherwise. See my chapter, or my article in Evolutionary Anthropology (2005), https://www.researchgate.net/publication/236159978_Hair_Human_Evolution_and_the_Idea_of_Human_Uniqueness, available as a free download on ResearchGate. On "mental time travel" you might want to look at scrub jays, for example:
"Many animal behavior experts agreed with Dr. Tulving, even though they had not actually run experiments testing the idea. But when Nicola Clayton, a comparative psychologist, first heard about the claim, she had a different reaction. “I could feel my feathers ruffling,” said Dr. Clayton, who is now at the University of Cambridge. “I thought, hang on, that doesn’t make sense.”
Dr. Clayton began to test western scrub jays to see if they met any of the criteria for episodic memory. The jays can hide several thousand pieces of food each year and remember the location of each one. Dr. Clayton wondered if scrub jays simply remembered locations, or if they remembered the experience of hiding the food.
She ran an experiment using two kinds of food: moth larvae and peanuts. Scrub jays prefer larvae to peanuts while the larvae are still fresh. When the larvae are dead for a few hours, the jays prefer peanuts. Dr. Clayton gave the birds a chance to hide both kinds of food and then put them in another cage. She later returned the birds to their caches, in some cases after four hours and in other cases after five days."
See some of her other research on this:
Ostojic, L., Shaw, R. C., Cheke, L. G. & Clayton, N. S. (2013). Evidence suggesting that desire-state attribution may govern food sharing in Eurasian jays. Proceedings of the National Academy of Sciences 110, 4123-4128.
Shaw, R. C. & Clayton, N. S. (2013). Careful cachers and prying pilferers: Eurasian jays (Garrulus glandarius) limit auditory information available to competitors. Proceedings of the Royal Society B 280 (1752), 1-7.
Cheke, L. G. & Clayton, N. S. (2012). Eurasian jays (Garrulus glandarius) overcome their current desires to anticipate two distinct future needs and plan for them appropriately. Biology Letters 8, 171-175.
Seed, A. M., Call, J., Emery, N. J. & Clayton, N. S. (2009). Chimpanzees solve the trap problem when the confound of tool-use is removed. Journal of Experimental Psychology: Animal Behavior Processes 35, 23-34.
Russell, J., Alexis, D. M. & Clayton, N. S. (2009). Episodic future thinking in 3- to 5-year-old children: The ability to think of what will be needed from a different point of view. Cognition 114, 56-71.
I appreciate, Niccolo, that research with various species over the past two decades illustrates their cleverness. However, the crucial comparison is with our closest non-human primate relatives, Pan paniscus and Pan troglodytes. Though the cognitive gap separating these species from Homo sapiens has narrowed in line with the continuity model, there continue to be crucial differences concerning the ability to imitate in sophisticated ways that allow innovations discovered by one individual to be transmitted to other individuals in a group, and then for the group to share and invest in this knowledge. Although higher primates are able to make and use basic tools, using sticks or stones as hammers, this knowledge is transmitted in a happenstance way through emulation and signal enhancement. Knapping a basic stone tool seems beyond the natural capacity of non-human primates, and producing tools using shared knowledge that facilitates the making of composite tools appears to be restricted to larger-brained hominins. This ability is facilitated not only by a more diffuse mirror neuron system but also by a theory of mind that provides four/five levels of intentionality compared to one/two at best in chimps. This also allows Homo sapiens to see the world from another person's perspective, with all this entails for cooperative endeavour, the pooling of resources, and the accumulation of cultural traits. Thus, it is the ability to transmit, share, build upon, and exploit information among individuals and groups that leads to niche construction through the ratchet effect. These processes do not seem to exist in other species, including jays. We now seem to be seeing explicit evidence in the archaeological record for the above traits at least as far back as 165,000 years ago at Pinnacle Point in South Africa.
I think Derek you are engaging in a tautology here:
"This ability is facilitated not only by a more diffuse mirror neuron system but also theory of mind that provides four/five levels of intentionality when compared to just one/two at best in chimps. This also allows Homo sapiens to see the world from another person’s perspective.."
Other mammals have mirror cells; just what qualities are you suggesting? The literature on this is really thin. On the other hand, how do you know what birds think, how their "perspective" is so qualitatively different, and what this really means for explaining the difference between Neolithic humans, who produce complex society, are food producers, make war, build cities, etc., and prior human populations who did not? And, more importantly, as I point out in my book, why are complex societies so limited in human history? Ants grow food and keep herds of aphids that they "milk" for sugar; they (some species) make war, take prisoners, and have slaves. Someone once wrote an article in a journal arguing that humans were the only animals that keep pets. This is of course a question of definitions: what is a pet? Well, it is obviously a problem of interpretation: many animals have commensals, animals of other species that they allow to remain in their nests. They are usually complex-society or eusocial animals. Is that not what pets are to us?
The idea that other animals cannot have the kinds of experiences humans do was first investigated systematically a century ago by Köhler in his Mentality of Apes (published in English, 1925), and recall that Goodall was chastised, and her research at first condemned, because she identified the emotions of the apes she worked with. Talking about brains is a concern as well, since in terms of brain-to-body ratio we do not have the largest brain; some of the cetaceans do, as does the dwarf monkey (a marmoset, and poorly studied). Jerison's attempt to modify this with his formula is, to me, another aspect of the human desire to be the best. Holloway's efforts as well. Neither modification can explain why we had huge brains for over a million years while it made little difference, unless you take the dolphin explanation: we were massaging our egos and engaged in sensuality.
I would reinforce the points made by Trevor. Population size and growth are extremely difficult to infer from archaeological materials, and I am not convinced we have any robust measure of demography for the Palaeolithic at this point. Certainly, recent attempts to count things (artefacts, sites, dates) as a proxy for numbers of people are rife with difficulty, and while they are worthy attempts, to say we have population curves is unrealistic. In the same way, the nature of cultural systems in pre-sapiens is unresolved, and it seems clear archaeologists have consistently tried to draw a line that represents the onset of modernity, only to find earlier examples of any markers they employed. Clearly the materials that we use to designate modernity go back a long way, ruling out a recent revolution. But this still leaves the archaeological signature of 'modern' behaviour ill-defined. I have argued that part of the problem is that researchers often look for universal marker(s), whereas it might be more useful to think of modern behavior as flexible and without a universal set of behaviours. See:
https://www.researchgate.net/publication/264233770_Early_Old_World_migrations_of_Homo_sapiens_archaeology?ev=prf_pub
Good points, Peter. I will look for your chapter. Who is the author of the book?
Hello Niccolo,
The chapter is in Immanuel Ness (ed.) The Encyclopedia of Global Human Migration, Volume 1, Blackwell.
I also expand on the theme in a forthcoming chapter:
Hiscock, P. 2014 Cultural Diversification and the Global Dispersion of Homo sapiens: Lessons from Australia. In Y. Kaifu, M. Izuho, T. Goebel, H. Sato, and A. Ono (eds) Emergence and Diversity of Modern Human Behavior in Paleolithic Asia. Texas A&M University Press.
Trevor, the concept of "modern human behaviour" has been subject to substantial criticism since the discovery of the various artefacts from the Middle Stone Age of South Africa (Shea 2011), some dating back as far as 165,000 years (Marean 2010), which is the reason the phrase is in quotes. This criticism has mainly been directed at the trait list of behavioural factors inferred from the artefacts recovered. Though the ability to indulge in symbolic behaviour has been proposed as a way of defining modern human behaviour, this too has come under criticism, as the subtleties of what should count as symbolic have not been sufficiently defined (semioticians are particularly vociferous on this point). Nevertheless, the fact that so many finds from South Africa suggest a measure of behavioural flexibility/variability was prevalent near the point when anatomically modern humans appeared suggests that this flexibility allowed populations to exploit resources in response to the austere and changing environmental conditions of the time. Population levels would have risen according to how successful foragers were in counteracting such conditions. It seems that exploitation of coastal resources not only allowed perilously low population levels to survive but also led to their expansion (Marean 2010), leading to further innovative technical strategies. As Richerson and Boyd (2005) have shown, population levels, innovation, and the prevailing environmental conditions are dynamically related. The exponential rise in population levels during the Neolithic was also probably associated with this dynamic but underwent intensification thanks to settled communities and an increase in specialization. The influence of population and demographics has been modelled in a number of scenarios; see, for example, Henrich (2004), Shennan (2001), Powell et al (2009), and Derex et al (2013). Moreover, rather than a rigid trait list, it seems that behavioural variability is key to understanding Homo sapiens sapiens as a species (Shea 2011). As I have pointed out on a number of occasions, mirror neurons and associated systems, such as theory of mind, may have provided the conditions for this variability to take place and, in this sense, served as necessary cognitive precursors (Hodgson 2010, 2012a, 2012b). Thus, while Peter Hiscock rightly criticises the concept of "modern behaviour", I note that he agrees with the idea of behavioural variability as set out here. Peter does, however, tend to play down the significance of the wealth of evidence now forthcoming from the Middle Stone Age of South Africa that is suggestive of flexibility/variability.
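To make the logic of these demographic models concrete, here is a toy simulation loosely in the spirit of Henrich's (2004) imitate-the-best model, in which naive learners copy the most skilled individual with a systematic loss plus occasional lucky errors. The function names and all parameter values are my own illustrative choices, not taken from any of the papers cited above.

```python
# A minimal sketch of a Henrich-style demographic model of cumulative
# culture. Parameters (alpha, beta, starting skill) are illustrative.
import numpy as np

rng = np.random.default_rng(42)

def one_generation(z_best, n, alpha=1.0, beta=0.3):
    """n learners imitate the most skilled individual.

    Each learner acquires z_best - alpha (a systematic copying loss)
    plus a Gumbel-distributed "lucky error" of scale beta; the new
    cultural high-water mark is the best of the n attempts.
    """
    attempts = z_best - alpha + rng.gumbel(loc=0.0, scale=beta, size=n)
    return attempts.max()

def run(n, generations=200, z0=10.0):
    z = z0
    for _ in range(generations):
        z = one_generation(z, n)
    return z

for n in (10, 100, 1000):
    print(f"population {n:5d}: final skill {run(n):8.2f}")
```

Because the expected maximum of n Gumbel draws grows roughly as beta*(ln n + 0.577), mean skill accumulates only above a critical population size and decays below it, which is one way of seeing why, as in Henrich's Tasmanian case, population levels can matter independently of individual cognition.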
To answer Niccolo's point regarding mirror neurons, increasing evidence supports their existence in humans, where they are utilised in many more scenarios, both internal and external to the brain, than in non-human species (see, for example, Gallese et al 2011). I also note that Niccolo did not respond to my earlier point regarding the primacy of group co-operation for producing the complex tools and technical innovations already extant in the Middle Stone Age, including adhesives, traps, bows, arrows, composite tools, spears, heating technology, beads, necklaces, etc. (De La Peña et al 2013)—we do not find this in non-human species.
References
De La Peña, P., Wadley, L. and Lombard, M. (2013). Quartz Bifacial Points in the Howiesons Poort of Sibudu. South African Archaeological Bulletin. 68 (198): 119–136.
Derex, M., Beugin, M-P., Godelle, B. and Raymond, M. (2013). Experimental evidence for the influence of group size on cultural complexity. Nature. 503: 389-393.
Gallese, V., Gernsbacher, M. A., Heyes, C., Hickok, G. and Iacoboni, M. 2011. Mirror Neuron Forum. Perspectives on Psychological Science. 6: 369-407.
Hodgson, D. 2010. Determining the Behavioural Profile of Early Modern Humans: Assimilating Population Dynamics and Cognitive Abilities. Before Farming. 2: 1-8. Available from: http://www.waspress.co.uk/journals/beforefarming/journal_20102/index.php.
Hodgson, D., 2012a. Cognitive evolution, population, transmission, and material culture. Biological Theory. 7, 237–46.
Hodgson, D., 2012b. Accommodating opposing perspectives in the ‘modern human behavior’ debate. Current Anthropology. 53(3), 358.
Henrich, J. (2004). Demography and cultural evolution: how adaptive cultural processes can produce maladaptive losses—the Tasmanian case. American Antiquity 69,197–214.
Marean, C. W. (2010). Pinnacle Point Cave 13B (Western Cape Province, South Africa) in context: The Cape Floral kingdom, shellfish, and modern human origins. Journal of Human Evolution 59 (3-4): 425–443.
Powell, A., Shennan, S. and Thomas, M. G. (2009). Late Pleistocene demography and the appearance of modern human behavior. Science. 324 (5932): 1298–1301.
Richerson P. J. and Boyd, R. (2005). Not by genes alone: how culture transformed human evolution. Chicago: University of Chicago Press.
Shea, J. J. (2011). Homo sapiens is as Homo sapiens Was: Behavioral Variability versus “Behavioral Modernity” in Paleolithic Archaeology. Current Anthropology 52 (1), 1-35.
Shennan, S. (2001). Demography and cultural innovation: a model and its implications for the emergence of modern human culture. Cambridge Archaeological Journal 11: 5–16.
Watts, I. (2010). The pigments from Pinnacle Point Cave 13B, Western Cape, South Africa. Journal of Human Evolution. 59 (3–4): 392–411.
Hello Derek,
I agree with you about the "significance of the wealth of evidence that is now forthcoming from the Middle Stone Age South Africa that is suggestive of flexibility/variability". My point is really that elements of such flexibility may also be present earlier in the African record. Sue O'Connor and I made this point about microliths, and several other researchers advocating gradualist models have explored the issue. The pre-HP and late ESA patterns are somewhat less expansively described than the HP, but there are a lot of hints that elaborate and flexible behaviours may have been in place. The significance is again that the identification of modernity is not straightforward, and many traits need not be simply associated with biological taxa.
See
Hiscock, P. and S. O’Connor 2006 An Australian perspective on modern behaviour and artefact assemblages, Before Farming [online version] 2006/2 article 4
Dear Derek:
I certainly did answer your assertions; however, you have ignored my citations. The idea of human uniqueness is an ingrained anthropocentric idea that has lasted millennia. The research you are citing uses many of the experiments cited in Hill, Kim; Barton, Michael & Hurtado, Magdalena, "The emergence of human uniqueness: characters underlying behavioral modernity," Evolutionary Anthropology, v. 18, n. 4, 2009: 187-200. Here the strategy of the researchers is to create experiments that test their own assumptions about human abilities, based largely on the limitations of human physiology. Then, when the animals cannot perform in ways dictated by human physiology, they claim the behaviors are unique to humans!!! Self-fulfilling at best. As I said, I address this issue in a number of publications, not only the one cited in Evolutionary Anthropology in my post above but also in my chapter "The Tendency to Make Man an Exception" (Chapter 6) in Primatology: Theories, Methods and Research, eds. Emil Potocki and Juliusz Krasinski, Nova Science Publishers, 2009 (ISBN 978-1-60741-852-8). The central problem with the "complex" behavior noted in your Middle Stone Age citations is that we are giving value to certain kinds of features (scratches on bird eggs, bones, etc.) when we really have no idea what they mean. It is much like Binford's question about what anthropologists really believe early humans did, that is, what we think "human" means (see Binford, L., Bones: Ancient Men and Modern Myths, 1981). We can chart the history of ideas of the "creative explosion" concept that Richard Klein has focused on (2002), or Milford Wolpoff's essays on what it means to be human; see his chapter here on RG: https://www.researchgate.net/publication/226741817_Neandertals_and_the_Roots_of_Human_Recency. As for Pinnacle Point, the only human remains are from the youngest levels, but the inferences of humanness are still rather stilted in the report. How do you claim to combine these remains into a cultural complex that identifies what is human? As for tool use in a social context, we have known ants do this for more than 30 years (Philip McDonald, "Tool Use by the Ant, Novomessor albisetosus (Mayr)," Journal of the New York Entomological Society, Vol. 92, No. 2, Apr. 1984, pp. 156-161).
Dear Peter,
I don't disagree that signs of behavioural flexibility may be present in the archaeological record before the arrival of anatomically modern humans, as McBrearty and Brooks (2000) suggested some time ago. The Schöningen spears provide but one clue that this was the case. I myself am inclined to the view that we have tended to underestimate the ability of early hominins such as Homo heidelbergensis. The problem is that archaeologists require a high threshold of proof before finds are accepted as indicating a particular behavioural trait, and, in this respect, the evidence of pre-human abilities continues to be controversial. In contrast, the South African evidence from the Middle Stone Age has received more universal acceptance, thus prompting my question. There must, however, have been some skills/abilities already existing in pre-human species that served as precursors to the more flexible ones suggested by the South African sites. In this regard, I proposed (Hodgson 2000) that abilities before the Middle Stone Age/Middle Palaeolithic were developing at different rates according to prevailing circumstances but were ultimately constrained by existing cognitive abilities.
Hodgson, D. 2000. Art, Perception and Information Processing: An Evolutionary Perspective. Rock Art Research. 17 (1): 3-34.
McBrearty, S. and Brooks, A. S. 2000. The revolution that wasn't: a new interpretation of the origin of modern human behavior. Journal of Human Evolution. 39: 453–563.
I think it is a very great puzzle why human intelligence and the capacity to cumulatively create complex cultural adaptations evolved so recently. We have come to be the earth's dominant species because of these capacities. Yet all the other "killer adaptations" (internal skeletons, camera-style eyes, powered flight, etc.) evolved long ago. Why not big brains too? I've speculated, underline speculated, that one possibility is that the highly variable environments of the Pleistocene selected for large brains and the cultural systems they support. You can find an essay about this hypothesis on my web page: http://www.des.ucdavis.edu/faculty/Richerson/Climatewasstrange.pdf. Some evidence is consistent with it. For example, human brain size and cultural sophistication increased considerably over the course of the Pleistocene, and the increase is correlated with an increase in millennial- and submillennial-scale climate variation over the last few glacial-interglacial cycles for which we have good core data.
Dear Peter (Richerson),
In your paper you seem to imply that Upper Palaeolithic artefacts are more complex than those from the Middle Stone Age of South Africa. In fact, the finds from South Africa now seem to blur the boundaries between these traditions, as the archaeology indicates: e.g., the use of fire to provide heat in underground sand "kilns" to produce hardened cores for knapping tools 162,000 years ago, which requires complex knowledge of precise temperatures (Brown et al 2009); the exploitation of adhesives and the evidence for bow and arrow technology (Lombard 2006, 2011); shell palettes for mixing ochre with fat dating to 100,000 years ago (Henshilwood et al 2011); advanced projectile weapons employing microliths that were launched with spear throwers 71,000 years ago (Brown et al 2012); etc. These finds suggest that, well before 50,000 years ago, early modern humans already possessed the necessary cognitive faculties to promote survival. These and similar finds seem to indicate that the cognitive flexibility to cope with changing climatic conditions existed when anatomically modern humans first appeared. The depiction of animals during the European Upper Palaeolithic, although spectacular, needs to be seen in context, in that even some indigenous groups up to the recent past did not have a tradition of representational art. The evidence from South Africa, therefore, seems to be stacking up in favour of the idea that the "sapient paradox" is well on the way to being resolved.
With regard to the influence of climate, it is notable that a number of archaeologists have cited climatic change as influential in driving forward cultural/technological innovations, particularly in relation to the Toba eruption around 70,000 years ago and the increasingly dry and colder conditions from 190,000 to 125,000 years ago (MIS 6). Thus, the reliance on socio-cultural criteria may well have been operating close to when anatomically modern humans first appeared, but this was hesitant due to the sparse and fluctuating population levels.
With regard to brain size, though climatic conditions may have been important, social group size has also been a predictor of primate brain size (Dunbar 1992), which may also have led to a reorganisation of certain structures in the neocortex. As a large brain is expensive to maintain, it seems to have reached its optimum size (it has, in fact, shrunk somewhat over the past 40,000 years) but has transcended this limitation through the "invention" of culture (Hodgson 2000), which, as you have rightly and repeatedly pointed out, has allowed a more rapid response to changing environmental conditions than would otherwise have been the case.
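For concreteness, Dunbar's (1992) social-brain prediction reduces to a simple log-log regression, so it can be written in a couple of lines. The coefficients below are the commonly quoted values from that paper, but treat this as a sketch to be checked against the original rather than an authoritative implementation.

```python
# A minimal sketch of Dunbar's (1992) regression of group size on
# neocortex ratio: log10(N) = 0.093 + 3.389 * log10(CR).
# Coefficients quoted from memory of the paper; values illustrative.
import math

def dunbar_group_size(neocortex_ratio: float) -> float:
    """Predicted mean social group size from the neocortex ratio."""
    return 10 ** (0.093 + 3.389 * math.log10(neocortex_ratio))

# A human neocortex ratio of roughly 4.1 yields the familiar
# "Dunbar number" of about 150.
print(round(dunbar_group_size(4.1)))  # ~148
```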
Brown et al. 2009. Fire as an Engineering Tool of Early Modern Humans. Science. 325: 859-862.
Brown et al. 2012. An early and enduring advanced technology originating 71,000 years ago in South Africa. Nature. 491: 590–593.
Dunbar, R. I. M. 1992. Neocortex size as a constraint on group size in primates. Journal of Human Evolution. 22 (6): 469-493.
Dunbar, R. I. M. 1996. Grooming, gossip and the evolution of language. Cambridge, MA: Harvard University Press.
Henshilwood, C. S. et al. 2011. A 100,000-Year-Old Ochre-Processing Workshop at Blombos Cave, South Africa. Science. 334: 219-222.
Lombard, M. 2006. Direct evidence for the use of ochre in the hafting technology of Middle Stone Age tools from Sibudu Cave. Southern African Humanities. 18 (1): 57–67.
Lombard, M. 2011. Quartz tipped arrows older than 60ka: further use-trace evidence from Sibudu Cave. Journal of Archaeological Science. 38:1918-1930.
Dear Derek,
Thanks for the comments. I'm well aware of the South African material recovered by Henshilwood, Marean, and others. It is odd that tool traditions of greater complexity seem to spring up irregularly and then disappear in S Africa (Jacobs et al.). I've speculated about what this might mean. Allison Brooks told me a couple of years ago that she has evidence for a continuous UP-like industrial tradition in Central Africa. She attributes the spotty record in S Africa to recurrent extreme droughts. The last time I looked she had not published anything on this. Clearly, no other Old World region can match Europe in the person-years devoted to hunting for Paleolithic material, and we remain prisoners of ignorance about much of what might have gone on in Africa and Asia in the late Paleolithic.
I certainly think that population size is very important but I'm a little skeptical of Robin's argument. At least ethnographic hunter-gatherers typically lived in much larger ethno-linguistic groups, often called tribes, than the "Dunbar Number" of 125 or so.
One-off events like the Toba eruption are hard to deal with. MIS 6 was pretty much like the glacial before it. Those two had more abrupt variation than more ancient ones, but not as much as MIS 2-4. Admittedly the correlation between brain size and abrupt change can be only partially tested with the available long high-res cores, but the relationship in my figure is pretty impressive, I think.
Human brains may have shrunk in the Holocene. I think it is a fun fact and ought to be true. But the sample of UP skulls is rather small. Also, UP people were probably larger overall than Holocene folks, so you'd have to do the allometric correction right. Unless I've missed recent work on the subject, I think the evidence for brain size reduction is thin.
People talk about brain reorganization being important, but I'm skeptical about that too. The brain evolutionist Georg Striedter argues that all vertebrate brains are scale models of one another. He also argues that brains have a sort of law of mass action: bigger parts send axons out in proportion to their size. The large human forebrain sends axons right down our spinal column to monitor our fingers and toes; hence we find it easy to learn intricate manual skills and dance steps. Simon Reader and Kevin Laland have a nice comparative primatology paper showing that forebrain size, social learning, and innovativeness are part of the same package. If brain size is largely what is evolving, we know that selection can generally move traits along quite rapidly on the geological time scale. Reorganization seems to be invoked to account for the relatively slow increase in brain size. The competing hypothesis is that brains are selected to be as small as possible because of their high metabolic cost, and that they only grow large when the functions they serve become very important.
Best, Pete
Jacobs, Z., Roberts, R. G., Galbraith, R. F., Deacon, H. J., Grün, R., Mackay, A., . . . Wadley, L. (2008). Ages for the Middle Stone Age of Southern Africa: Implications for human behavior and dispersal. Science, 322, 733-735.
Reader, S. M., & Laland, K. N. (2002). Social intelligence, innovation, and enhanced brain size in primates. Proceedings of the National Academy of Sciences USA, 99, 4436-4441.
Richerson, P. J., Boyd, R., & Bettinger, R. L. (2009). Cultural innovations and demographic change. Human Biology, 81, 211-235.
Striedter, G. F. (2005). Principles of Brain Evolution. Sunderland MA: Sinauer.
Dear Peter and other contributors,
Ample evidence has accumulated over the recent past attesting to the reorganisation of the human brain when compared to nonhuman primates and early hominins. For example, Holloway (2014) found displacement of the lunate sulcus in australopithecines, suggesting reorganisation occurred prior to brain expansion; Orban (2011) found differences between certain parts of the nonhuman primate and human brain in the processing of 3D shape from motion; area V3A in early visual cortex has undergone reorganisation (Tootell et al 1997); parts of the intraparietal sulcus exist in humans with no corresponding area in nonhuman primates (Vanduffel 2002); the temporal cortex expanded in later hominins (Rilling 2002); the arcuate fasciculus is more profuse (Ghazanfar et al 2008); white matter is more concentrated in certain areas (Schenker et al 2005); the parietal area seems to have expanded in humans compared to Neanderthals (Bruner 2010); and so forth (see Hodgson 2012 for a consideration of how brain reorganization relates to tool use in the context of the archaeological record). Interestingly, the realization of a more complex brain has led to a need for a longer developmental trajectory in humans than in other primates (neoteny), due to the need for the brain to absorb information from socio-cultural input. In this respect, human children experience a much longer period of vulnerability compared to our closest relatives, with the brain not fully matured until the end of adolescence or early adulthood. Dental eruption, which is an indicator of developmental timing, shows that pre-Homo sapiens species experienced a faster rate than Homo sapiens (Rozzi and de Castro 2004). This all goes to show that perhaps around 500,000 years ago socio-cultural factors were beginning to take effect, reflected in brain reorganization and a slower developmental scenario. Yet it was only with the arrival of anatomically modern humans that socio-cultural factors came to fully dominate.
With regard to Renfrew’s “sapient paradox”, although the evidence from South Africa seems to suggest flexible behavior was prevalent close to when anatomically modern humans appeared, perhaps we will have to wait for additional evidence from this location for confirmation. I believe Fisher, Marean and others are, at the moment, carrying out research in South Africa to establish this (Fisher 2013).
References.
Bruner, E. 2010. Morphological Differences in the Parietal Lobes within the Human Genus: A Neurofunctional Perspective. Current Anthropology. 51 (Suppl. S): S77-88.
Fisher et al. 2013. Archaeological Reconnaissance for Middle Stone Age Sites Along the Pondoland Coast, South Africa PaleoAnthropology 2013: 104−137. doi:10.4207/PA.2013.ART82
Ghazanfar et al. 2008. Language evolution: neural differences that make a difference. Nature Neuroscience. 11 (4): 382-384.
Hodgson, D. 2010. Hominin tool production, neural integration and the social brain. Human Origins. 1: 41-64. https://humanorigins.soton.ac.uk/
Holloway, R. 2014. Paleoneurology Resurgent. In, Paleoneurology, E. Bruner (ed.) Springer. pp. 1-10.
Orban, G. A. 2011. The Extraction of 3D Shape in the Visual System of Human and Nonhuman Primates. Annual Review of Neuroscience. 34: 361–88.
Rilling, J. K. and R. A. Seligman. 2002. A quantitative morphometric comparative analysis of the primate temporal lobe. Journal of Human Evolution. 42: 505-533.
Rozzi, F. V. R. and de Castro, J. M. B. 2004. Surprisingly rapid growth in Neanderthals. Nature. 428: 936-939.
Schenker et al. 2005. Neural connectivity and neural substrates of cognition in hominoids. Journal of Human Evolution. 49: 547-569.
Tootell, R. B. H., Mendola, J. D., Hadjikhani, N. K., Ledden, P. J., Liu, A. K., Reppas, J. B., Sereno, M. I. and Dale, A. M. 1997. Functional analysis of V3A and related areas in human visual cortex. The Journal of Neuroscience. 17 (18): 7060-7078.
Vanduffel, W., D. Fize, H. Peuskens, K. Denys, S. Sunaert, J. T. Todd, and G. A. Orban. 2002. Extracting 3D from motion: differences in human and monkey intraparietal cortex. Science. 298: 413-415.
Derek, thank you for stating the current state of cognitive archaeology clearly. From my work on Ice Age rock art, and Younger Dryas formalised art, buildings, and building sites (Gobekli Tepe), I agree with the view that our current wiring is the definition of human, and that the early South African (and contemporary European, see D'Errico) sites have the full human repertoire minus population density and minus some materials such as metal.
Thus there was no neuro-cognitive change within the human era. Species may be as periodic as elements (as the late Prof Jan Bojiens said at an Evolution Open Day at Wits about two years ago). Thus there is no cultural evolution, only changes in materials and styles (the latter keep changing, and are inherently meaningless).
I fail to see how supposed neuro-cognitive change could "factor with population levels, demographics and the environment". Some modern and recent populations were living as precariously as some Stone Age people, and some Old Stone Age people had some luxury, including schools (see the lithic refitting at Taramsa Hill near Dendera).
I will study the sources you provide. You could see an introduction to my work, based in South Africa but using worldwide data, at www.mindprintart.wordpress.com.
I posted an update on relevant issues emerging from the new hominid fossils from Rising Star cave, on www.stoneprintjournal.com
Derek,
(1) What was Renfrew's paradox? Is it still relevant?
(2) The evolution of symbolic behavior (the archaeologist term), palaeoart (rock art term), religious behavior (my term) is at least two million years old. See my articles, databases, etc. Perhaps even older if we triangulate the question with chimpanzee religious behavior and other mammalian trans-species religious behavior.
(3) How about adding my published and presented papers to your bibliography?
Best regards
Hi Derek,
Kim Sterelny and I have a paper in press in Biological Theory that comments on the concept of the paradox. It's not available yet but will be shortly; it is in the last volume of 2017. Cheers
Peter
Hi Peter,
Thanks for letting me know about your upcoming paper, which I look forward to reading when published.
Derek
Edmund,
I think we need to consider cumulative culture and the "ratchet effect", and also the fact that archaeologists tend to require considerable contextual evidence from the archaeological record before they take on board a particular hypothesis.
Derek
James,
Perhaps "Renfrew's sapient paradox" is even more pertinent now that recent discoveries from Morocco indicate that the human lineage goes back as far as 300,000 years.
Derek
Derek, the cultural record appears cumulative due to the greater loss of earlier material, and because lower populations left less material, in fewer media. Human sciences often confuse sample size and sample quality. Cultural media and repertoire are not cumulative, as Gobekli Tepe (9000 BC) and Blombos and Border Cave (70,000 BC) demonstrate.
The only ratchets are glues for hafting, copper, bronze, and iron. As we know from the Bronze to Iron transition, there were no changes in culture or cognition, only changes in the scale and speed and proliferation of things. Not of thoughts or behaviour.
The hypothesis of cultural evolution has become a paradigm and an axiom. It is unscientific, and it is re-applied unscientifically. Abuse of the word 'evolve' is as rife in science as in popular culture.
Edmund,
It has been found that the human brain, due to its flexibility, is prone to being "reformatted" or "reformulated" through engagement with artifacts, material systems, and information-processing capabilities invented by humans. For example, the visual word form area seems to have co-opted the evolutionarily instantiated face area of the fusiform gyrus in order to process writing, the latter being a skill that is not instantiated as such but has to be learned. Thus we become entwined with the artificial systems that we create as a result of cultural input.
Derek
I agree with you, Derek, each human is a complex system. If you look at the complexity at various scales, it's even more mind boggling. But coarse patterning is there and I'm struggling with how to approach the problem myself. Symbolizing and "sapienism" must have begun long before 300,000 years ago.
Derek, thank you for the clearest summary of the case for cultural evolution that I have ever seen. Several problems are apparent, such as the linguistic terminology and model, recalling the eternal split between nature and nurture models, some misapplication of Jacobsen, and the under-development of Chomsky in the humanities. All species, including their neurology, are formatted by the environment, including material, and the resulting species faculties. Mongooses have a range of cries; thus they too are entangled with their faculties.
The summary describes music as well as language. Several species have superfluous compulsive use of faculties. We are entangled with music, part of being prone to patterns. Use of faculties, and engaging with natural structure, do not indicate invention. Various scales were not invented somewhere in Greece at 10am one morning.
Faculties are molded to parts of the natural spectra; thus perception and behaviour re-express structure. We did not invent anything. Sound allows a thick layer of optionality that has to be learned, which we use to form economic groups and to exclude other groups (which the EU is finding a stumbling block to the next phase of economic maturity).
Thus social behaviour is a more fertile field for behavioural changes, or supposed 'evolution', than media such as language. Species are also single organisms. Look at ants and gain wisdom. 'Invention', or rather the economic and technological maturity curve, is a collective function.
Behaviourism has had several schools based on linguistics and on computational models. Brains as single processors? With different software? The construct of artificial intelligence is itself an artifice.
Juliet,
Perhaps the human brain has always had the potential for outsourcing inherent modes of cognizance to be expressed in material form but needed the input that comes from raised population levels before this could occur. The complexity to which you refer may be a result of the dynamic interaction between artifacts, information systems, neural networks, and the innovation that arises from raised, more stable population levels. The fact that cultural complexity seems to have waxed and waned since the arrival of the fully fledged human brain suggests that the points of reference underlying this dynamic underwent considerable variability from the outset.
Derek,
I have now read Renfrew 2008, "Neuroscience, evolution and the sapient paradox". To paraphrase, the 'paradox' is that the 'biological basis' of the species Homo sapiens sapiens was established perhaps 200,000 years ago, and at least by 'out-of-Africa' 60,000 years ago, whereas these humans only 'transformed the world' beginning with the 'Sedentary Revolution' around 100,00 years ago ('urbanism, ceramics, domestication of plants and animals, personal and heritable property, literacy, metal-working'). Renfrew focuses on two behaviors: (a) gold or monetary measures of 'intrinsic value' for exchange, the 'notion of value itself' and 'equivalency'; (b) material sacred objects and associated religious beliefs and practices, for which 'we have no evidence for them at the date of out-of-Africa dispersals'.
I suggest that there is no paradox, except if one selectively considers human culture to be only the things in his ad hoc list. The selective list and its two foci reflect the dominant interests of UK Senators, that is, money (and property) and the religious and political institutions that protect them and 'transform the world' (including, but unmentioned, Eurocentric colonialism). Most of the points made in this paper are simply erroneous or reflect ignorance of neuroscience, archaeology, linguistics, and religious studies on the topic of contextual evidence for hominin, including Homo sapiens sapiens, evolution.
I will only here comment on his two special foci. With respect to the notion of value itself and equivalency, hominins as well as other species (!) appear to have such a notion. See for example the paper open for comment by De Cruz, with its short overview on the question of numerosity in human and other species and moral and mathematical realism https://www.academia.edu/s/d735bd435a/animal-cognition-species-invariantism-and-mathematical-realism
I have presented a detailed lit review of numerosity neuroimaging coordinates and mapped them onto the Homo habilis (Oldowan) brain expansion and reorganization areas https://www.academia.edu/32447099/Symbolic_Behavior_Palaeoart_at_Two_Million_Years_Ago_The_Olduvai_Gorge_FLK_North_Pecked_Cobble_--_The_Earliest_Artwork_in_Human_Evolution_110_slides_2013_IFRAO_International_Rock_Art_Congress_2013_May_30_2013_Albuquerque_NM_USA_Session_Archaeology_and_the_science_of_rock_art
Rather than Renfrew's conservative view of 'value', it was a major evolutionary step of the Habilines, around 2 million years ago, to develop land and marine meat-sharing, essential for brain expansion. Prior to this, at least based on chimpanzee behavior, males fought violently over prey meat carcasses, tearing them to pieces. Mary Leakey described the graphic design features of the Olduvai pecked cobble 1.8 MYA, and to my mind (biased by modernity) it in part consists of a graphic representation of the comparison of sets for equivalency with respect to quantity (4 versus 2) and size/amount (large vs small). Oldowan hominins were capable of intellectually, so to speak, playing with the idea (materially represented) of value equivalence and equity. (Renfrew might object that I am a 'liberal'?)
The idea that gold was the first equivalence value also seems contradicted by all hunter-gatherer exchange systems in which special commodities (for example, stone axes) are a measure of exchange value.
As for the 'sacred', I have published on chimpanzee religious behaviors, in part to triangulate that the hominin lineage must also have had religious behaviors over the last several million years or more. I would accept that two of the six basic components of religious behavior, namely the 'sacred' and 'sacrifice', seem to be unique to the hominin lineage. 'Veneration', as mentioned by Renfrew, has precursors in chimpanzee religion and will have had such in our hominin lineage as well. Rather than 10,000 years ago, religious behaviors, which, based on neuroscience, are part of our biologically 'hard-wired' brain, are, to repeat, 2 million years old or older.
In short, there's only a paradox if one overlooks the last millions of years, and thinks 'civilization' began 10,000 years ago.
I have come to the conclusion, which I will put out in a paper soon, that the thing which humans excel at, and which explains practically all our sapient behavior, is the ability to determine causation. All other animals work on correlation.
James,
By your statement "whereas these humans only 'transformed the world' beginning with the 'Sedentary Revolution' around 100,00 years ago", I assume you mean 10,000 years ago and not 100,000 years ago. As far as religious capacities are concerned, by which I infer you mean magical or supernatural thinking (religion usually means a hierarchical institutional system), this may only have had the potential to appear with the arrival of Homo sapiens, thanks to an advanced theory of mind.
Derek, here is the key couple of paragraphs from the Biological Theory paper I mentioned.
"We do not think that anything in cognitive science shows that we can simply “read-off” hominin cognition from behaviour or from artefacts. Even if skilled behaviour is not driven by explicit representation in semantic memory, we still need a positive account of implicit representation and its interaction with explicit information (which often plays an important role in teaching and error diagnosis). One of the puzzling features of human evolution is the lack of any clear correlation between the appearance of new hominin species and marked changes in the technical, ecological, and social lives of hominins (with the possible exception of the correlation between the evolution of erectus and Acheulian technology). This lack of correlation has been much discussed with respect to our species. The supposed problem is that Anatomically Modern Humans appeared in the historical record hundreds of thousands of years before Behaviourally Modern Humans. Indeed, on a recent estimate our species has existed for about 300k years (Hublin, Ben-Ncer et al. 2017), while most discussions place the emergence of Behaviourally Modern Humans within the last 100k years (Henshilwood and Marean 2003). If this is so then for more than half of our biological existence, the technical competence, ecological role and social lives of members of our species seemed akin to their Middle Stone Age contemporaries and ancestors rather than to their conspecific descendant. From about 50 kya, after a long and patchy transition period, our forager ancestors’ social, economic, and technical lives fell into the range of variation of historically known foragers. Yet those ancient humans are of the very same species as us. So why did it take so long for the Modern Mind to join the Modern Body? This is Renfrew’s “sapient paradox” (see e.g. Renfrew 2008) however if either niche construction approaches or 4E views of cognition are on the right track there is no reason to expect innovations in material technology, foraging, or social life to be correlated with speciation patterns. Reasons for this include (i) the cognitive capacities of individuals do not depend solely on their individual phenotypes, let alone on their neocortex, for they depend on their access to external supports as well; and (ii) the informational resources of communities are not a simple reflection of the cognitive capacities of individuals in those communities (and it is often communities, through collective action, that leave material traces). How the community is networked, and the patterns of informational cooperation (and lack of cooperation) are equally important.
As a consequence, there is no paradox in the “sapient paradox”. This makes the methodological problems more acute for cognitive archaeology. Those who emphasise the cognitive importance of niche construction also think that cumulative cultural evolution impacts on how we think, not just what we think; the defenders of extended and distributed approaches to cognition converge on the conclusion that there can be very significant cognitive differences within species. This potential variation in human actions and hence capacities stems from multiple factors: cognitive capacities depend not just on genetic endowment, but also on the material and social supports that scaffold their cognition. There is no reason at all to suppose that (for instance) norms of teaching and information sharing, or the availability of material supports for cognition, are roughly constant across genetically similar communities that are nonetheless widely separated in space and time. If all of that is right, we cannot, for instance, aggregate data on material culture and economic lives across Neanderthal sites to form a composite picture of “the Neanderthal Mind” (Wynn and Coolidge 2004, Wynn, Overmann et al. 2016). Even idealising away from genetic evolution in the Neanderthal lineage, there is no “the Neanderthal Mind”. At best, there is a spectrum of variation, a space of possible Neanderthal Minds. There is a certain reluctance amongst the archaeological sympathisers of 4E cognition to embrace this consequence. Thus John Gowlett, Clive Gamble and Robin Dunbar sign up to many of its signature ideas, while wanting to hang onto the idea that relative neocortex volume constrains social complexity (indexed as maximum group size) (Gamble, Dunbar et al. 2014). The seriousness of this challenge depends on how profoundly these extra-somatic factors influence cognition, and on when those factors began both to be important in hominin evolution and vary from community to community. Great ape intelligence may be embodied, but it is presumably neither extended nor embedded."
Peter Hiscock,
Excellent comment. I agree pretty much totally with what you say and the questions you raise.
I am currently revising my Supplementary File to Harrod (2014); I posted a revision a few days ago, but now am revising it again, in response to the new Irhoud dates as well as Van Peer's Sangoan out-of-Africa hypothesis. In my earlier Supplementary File I had suggested the Lugbara of Uganda, originally from Sudan, as a proxy for genetics and mythology out-of-Africa to the Daramulun mytheme in SE Australia. I long puzzled over whether this was independent convergence or diffusion. The new Irhoud dates suggest diffusion out-of-Africa by robust Homo sapiens sapiens with a Sangoan industry. I hope to post this in-progress revision this weekend or next week.
Best regards
In contrast with any other animal, humans are in sole possession of the ability to determine causation, whether applied to tool manufacture or understanding the natural world. This ability allows humans to bring a common mental template to virtually any set of circumstances. Other animals may excel at correlation, and while humans also retain this ability, it is the determination of causation that allows humans to both predict and retrodict. The evolutionary development of this ability is observable in the production of artifacts, which increasingly relied on the more precise prediction and achievement of outcomes that we refer to as skill.
Peter,
Thanks for the preview of your paper. I published a paper some years ago in Biological Theory (Hodgson 2013) stating that some scholars, such as Shennan, Richerson, Boyd, Henrich, and Powell, discount the brain as having anything important to say about cultural processes, while others (some of whom you mention) continue to claim that neural factors, as realized in cognition, remain relevant. In my paper, I also pointed out that the flexibility that we see in human cognition and cultural diversity nevertheless still needs to be explained. For example, the fact that skills can be lost as well as gained may be a result of the sensitivity of mirror neurons (and the associated system facilitating theory of mind) to population densities/rates and culturally embodied material knowledge. Here is the abstract of the paper:
“There has been much debate regarding when modern human cognition arose. It was previously thought that the technocomplexes and artifacts associated with a particular timeframe during the Upper Paleolithic could provide a proxy for identifying the signature of modern cognition. It now appears that this approach has underestimated the complexity of human behavior on a number of different levels. As the artifacts, once thought to be confined to Europe 40,000 years ago onwards, can now be found in other parts of the world well before this date, especially in South Africa, this suggests that modern cognition arose well before this period. Moreover, the variability of the archaeological record from the time when anatomically modern humans appeared 200,000 years ago suggests cognitive factors alone are unable to explain the obvious unevenness. In this article, it will be demonstrated how neuro-cognition can be assimilated with population dynamics and the transmission of information between individuals and groups that can provide important insights as to the nature and origins of modern human cognition.”
In essence, the paradox of Renfrew’s sapient paradox is that, in the last analysis, there is no such paradox, for some of the reasons I mention in the above paper and some of those expressed in your extracts.
Hodgson, D. 2013. Cognitive Evolution, Population, Transmission, and Material Culture. Biological Theory. 7: 237-246. http://link.springer.com/article/10.1007/s13752-012-0074-y
Thanks Derek. There is much in there. Can I start with one question? Why do specific kinds of artefacts indicate modern cognition? There are many factors that can lead to the emergence of certain kinds of craft activities, thereby creating artefacts. A long while ago I made the point that microliths in Africa appear to cycle in and out of the record over more than 300,000 years. If these indicate modern cognition, does that mean it was lost and re-established multiple times? I think it unlikely that artefact forms are good proxies for cognitive capacity or brain structure.
Peter
I agree fully that artifact forms are not good proxies for cognitive capacity or brain structure in the course of the human evolution of culture. From my review of neuroscience brain-imaging studies, I map the neural substrate networks for toolmaking, artmaking and numerosity, layering my hypothesis for the networks on top of a Tobias illustration of the expansion areas of a Homo habilis brain. It is artmaking that activates the most extensive network, then numerosity, with toolmaking activating much less. Toolmaking, at least in the early Stout studies, does not activate Broca's area for language, so the studies hypothesizing that tools and language co-evolved in parallel appear falsified. Furthermore, it is artmaking that activates the frontal pole, which is the hub for advanced intellectual abilities, such as doing Raven's matrices. All this I presented in my IFRAO Albuquerque 2013 presentation (online in a 110-slide version): https://www.researchgate.net/publication/316349261_Symbolic_Behavior_Palaeoart_at_Two_Million_Years_Ago_The_Olduvai_Gorge_FLK_North_Pecked_Cobble_--_The_Earliest_Artwork_in_Human_Evolution_110_slides_IFRAO_International_Rock_Art_Congress_2013_Albuquer
We might make some progress here by utilising the concept of the extended mind/brain whereby, through embodied processes, the mind, through social engagement in humanly made material objects, became a distributed commodity. By this I mean that materials became a shared human commodity available to the community as a whole. By changing the material parameters, so the human brain, thanks to its inherent plasticity, was able to reformat the neural systems for purposes other than originally devised. Thus, the human brain seems to consist not just of modular sub-systems but of transmodal co-ordinates that facilitate the retuning of the basic neuro-functions. By producing increasingly complex artefacts of various kinds, humans were not only able to retune the neural networks but also to supplement them with materially embodied, communally derived “external” knowledge that allowed niche construction to proceed. It seems that population levels and densities needed to expand beyond natural hunter-gatherer levels (of about 150-300 individuals) in more stable, sedentary communities before this feedback mechanism was able to get off the ground sustainably. Before, say, 40,000 years ago, although there are signs of this feedback mechanism beginning to operate, the material signature tended to remain fragile due to fluctuating and unstable populations. Barrett gives a good summary of some of these issues.
Barrett, J.C. (2013). The Archaeology of Mind: It's Not What You Think. Cambridge Archaeological Journal, 23, 117. doi:10.1017/S0959774313000012
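The population-threshold argument above is, at heart, the demographic model associated with Henrich, Shennan and Powell, and a toy simulation can make it concrete. The sketch below is purely illustrative: every parameter value is an assumption chosen for demonstration, and it substitutes Gaussian copying error for Henrich's Gumbel formulation, while preserving the key mechanism of lossy imitation of the most skilled individual.

```python
import random

def simulate(pop_size, generations=150, loss=1.0, noise=0.5, seed=1):
    """Each generation, `pop_size` learners try to copy the most skilled
    individual. Copies are systematically degraded (`loss`) but noisy
    (`noise`), so the best copy exceeds its model only when the group is
    large enough. A Gaussian stand-in for Henrich's Gumbel formulation;
    all parameter values here are illustrative assumptions."""
    rng = random.Random(seed)
    skill = 1.0  # skill level of the current best model
    for _ in range(generations):
        copies = [skill - loss + rng.gauss(0.0, noise) for _ in range(pop_size)]
        skill = max(0.0, max(copies))  # floor at 0 = the skill is lost entirely
    return skill

for n in (5, 50, 500):
    print(f"population {n:>3}: final skill {simulate(n):7.1f}")
# Small pools of learners lose the skill; larger pools ratchet it upward.
```

Run as-is, small populations drift to zero skill while large ones ratchet upward, which is one way of picturing both the fragility of the pre-40,000-year material signature and the cycling of microliths in and out of the African record noted earlier in this thread.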
Juliet and others,
The attached figure may help to elucidate the previous comments I made regarding population levels, niche construction and re-tuning of neural functions, which comes from:
Hodgson, D. 2013. Grappling with an enigma: the complexity of human behavior as a multidimensional phenomenon. In, The Psychology of Human Behavior, R. G. Bednarik (ed.). Nova Science Publishers: New York.
Hello
I apologise for coming belatedly to this fascinating and erudite conversation. I offer an omnibus of questions which I hope are well enough informed to justify responses. I have tried to present them in an orderly and compartmentalised way. I have some general points, then some specific queries, more or less in order of postings. My apologies if some of this sounds like a sophomoric footnote to “The Two Cultures” to some here. My sympathies are with 4E (or what I prefer to call SEED - Situated, Enactive, Embodied, Distributed - not that it matters much). I follow the work of Lakoff, Johnson, Gallese, Clark, Wheeler (etc), and in anthro and arch, Ingold and Malafouris.
I am in awe of those who make it their business to say anything useful about the prehistory of humans, based on such scanty evidence. A colleague made a relevant comparison between GOFAI (good old-fashioned AI) and the new(er) machine learning type. He said GOFAI applied sophisticated algorithms to a tiny data sample, while machine learning has greater success by applying bone-headedly simple statistical analysis to vast data sets. Sadly, archeology seems necessarily stuck in the GOFAI mode.
I have a modest suggestion based on my reflections on the Denisovan discovery, especially the genomic aspect. It occurs to me that if genomic testing reveals that Denisovans were a distinct group, then there may well have been others. I think archeology is about to be rewritten as we test all available archeological remains in this way. I predict that we will find 10 or 12 distinct(ish) types. And as Melanesians in particular carry Denisovan DNA, we will find that we all carry not only Neanderthal but other DNA. We will then be persuaded that H. sapiens was not the victor or subjugating master race but (a)rose through profligate and promiscuous hybridity (paleolithic sex tourism). This would have resounding impact on rhetorics of racial purity or purported racial superiority. This is entirely consistent with the biology of hybrid vigor, and its opposite, exemplified by the haemophilic buffoons of European royalty.
This suggestion has bearing on the current discussion as it suggests another dimension to the question. Was the mentally modern human a product of racial and cultural mixing? Many have seen language as the distinguishingly human ability. But was it the challenge of linguistic and cultural negotiation that bootstrapped various strains of Homo into cognitively modern status?
WRT the application of neurocognitive science ‘discoveries’ to (e.g.) archeology: across the humanities and social sciences, one often sees attempts to justify theories with ‘hard’ science. I am no doubt guilty of this myself but, as a committed interdisciplinarian and sometime theorist of interdisciplinarity, I am wary of two syndromes: 1. the kid-in-a-candy-shop syndrome, where the goods on display seem overwhelmingly persuasive, and 2. unprincipled, but often unconscious, cherry picking without awareness or understanding of the breadth of fields or their internal debates. On the other hand, philosophical conjecture can likewise be ignorant of relevant science. A little knowledge is (no doubt) a dangerous thing, but in interdisciplinary fields we all have only a little knowledge.
Scientific research agendas are often (sometimes wantonly) theoretically thin, as if an experiment were not an attempt to prove a hypothesis. That is, experimental agendas can be directed at philosophically dubious (imho) goals. For instance, much research in cog sci was an attempt to confirm (now outdated) ideas of cognitivist/computationalist paradigms of mind, which do not accommodate ‘external’ or embodied qualities in cognition.
@ Caldararo
WRT the human-exceptionalist thread in this discussion: this matter is explored by Daniel Dennett in a rich and balanced way in his new book From Bacteria to Bach, which I would recommend to all.
@Hiscock
“I would reinforce the points made by Trevor”
Who is Trevor?
@ Harrod. WRT remarks about ‘art’: based on my career in and around the modern art (and art history and theory) world, I query any assumptions about art, and particularly about the historical continuity of arts practices and their cultural purposes. While we, in late modern capitalism, separate ‘art’ from other activities, I contest that this happened even in early modern European cultures, at least in the same way, and therefore, it seems to me, making any such assumptions about historically or culturally more different cases is extremely dubious.
@ Hiscock “Even if skilled behaviour is not driven by explicit representation in semantic memory, we still need a positive account of implicit representation and its interaction with explicit information …”
This statement, imho, well captures the contemporary crisis of cognitive science. I concur with your reservations about “explicit representation”. We have very little idea what this might mean on a neurological level. Many contest the existence of internal representation. As Rodney Brooks observed long ago, ‘representation’ is a hazy idea which serves as a rhetorical device to coordinate the work of researchers who otherwise have little in common. I prefer to theorise skilled behavior without recourse to such insubstantial and unsubstantiated concepts as mental representation (while of course keeping a vigilant weather-eye on emerging science :). If mental representation is chimerical, any explanation of skilled behavior in terms of it is like demanding an account of physics in terms of ether or chemistry in terms of phlogiston.
So what is “implicit representation”? And for that matter, what, experientially or neurologically, is “explicit information”? I am very cautious about terminology which has been highly inflected by computational discourses. Just what do you mean by ‘information’ in this context?
@ Hiscock “There is no reason at all to suppose that (for instance) norms of teaching and information sharing, or the availability of material supports for cognition, are roughly constant across genetically similar communities that are nonetheless widely separated in space and time”.
I concur; we experience the profundity of material supports every time we are rendered stupid when we lose our phone.
@ Hodgson. “…. the realization of a more complex brain has led to a need for a longer developmental trajectory in humans than in other primates (neoteny) due to the need for the brain to absorb information from socio-cultural input. In this respect, human children experience a much longer period of vulnerability compared to our closest relatives with the brain not fully matured until the end of adolescence or early adulthood.”
It’s not simply a matter of time needed to “absorb information from socio-cultural input”. Parts of the brain simply grow late, even later than the occurrence of puberty. I remain amused by the fact that the prefrontal cortex (presumed responsible for empathy, social responsibility etc) does not mature until early adulthood - and in some cases, not even then. This has helped me understand my students, teenage children and adolescence in general :) They’re not socially disabled, just pre-enabled.
@ Hodgson. “We might make some progress here by utilising the concept of the extended mind/brain whereby, through embodied processes, the mind, through social engagement in humanly made material objects, became a distributed commodity.”
While I agree in general, we must also recognise the incompatibilities between various postcognitivist positions. You seem to be emphasising, as an archeologist might, the importance of exograms (Donald). But the (original) extended mind of Clark and Chalmers is explicitly individualist and disallows sociality (later contested and elaborated by many: Menary, Sutton et al). But in both cases the extensions can be characterised as external (passive) information repositories (again reflecting computational paradigms). I am more interested in dynamical or enactive phenomena which are more or less intractable according to cognitivist/computational approaches.
I concur that mind not only became, but is, a ‘distributed commodity’. But I query the exclusive emphasis on “humanly made material objects”. I am deeply invested in tool use, and as such I recognise not simply the isomorphism of tool and technique, but the historically developmental triple helix of tool, technique and application. Violin music, for instance, has developed to its current exquisite form by the mutually catalytic driving of increasingly sophisticated artisanal craftsmanship, musical composition and playing technique.
@ Hodgson. “By changing the material parameters, so the human brain, thanks to its inherent plasticity, was able to reformat the neural systems for purposes other than originally devised.”
I object to the use of ‘reformat’, for reasons discussed above, but generally concur. I’m not so persuaded by your citation of what appear to be minor variations in brain structure. As Elizabeth Bates wisely observed (consistent with Gallese’s neural exploitation hypothesis), language is a new machine made of old parts.
@ Hodgson
I query some of the assumptions in your 2013 diagram. In particular, the notion of disembodied cognitive representations. I assume you justify this in your paper - which I have not yet read. I also contest your implicit endorsement of the now very conventional representation of ’feedback from environment’ as a dashed line, as if this aspect of cognition is, as cognitivists would have it, somehow ‘peripheral’. On the contrary, I would argue it is utterly central. Your diagrammatic representation puts the machinations of the internal brain center stage (as such representations almost always do), and thus undermines your protestations of mind as (socially) distributed.
I challenge you to redraw your diagram to defend, rather than undermine, your assertion that “the mind, through social engagement in humanly made material objects, became a distributed commodity”.
respectfully,
SP
Simon,
Thanks for your perceptive comments. First, I agree with you that the human brain does not become fully mature until well after puberty (some say the prefrontal cortex is not fully mature until after 30 years of age). This corroborates my point regarding neoteny and the importance of culture for the fruition of a fully functioning human cognitive system. In evolutionary terms, some even say humans never fully mature, as they retain childlike ape features (such as a flat face) even when fully adult, which helps facilitate a more playful, enquiring brain compared to nonhuman primates.
Your statement: “But the (original) extended mind of Clark and Chalmers is explicitly individualist and disallows sociality (later contested and elaborated by many: Menary, Sutton et al)”.
There is an ongoing debate here regarding whether the onus is on the individual or the social in human evolution. In truth, it may be both. Richerson and Boyd, and many others, have shown how social interaction and material culture interact to produce cumulative knowledge, which can lead to beneficial outcomes for the group. Moreover, Robin Dunbar has generated evidence for the human brain being mainly social (the social brain hypothesis), which suggests social interaction is paramount thanks to a prodigious theory of mind that leads to the capacity to empathise. Pertinently, a recent study found that the “default” (resting) state of the brain continues to be mainly about the social aspects of cognition.
With regard to computational processes, archaeology is gradually distancing itself from the idea that the brain can be seen as similar to a computer, with the hardware as the neural circuits and the software being the culture slotted into these circuits that leads to the ability to employ “symbols” (a view stemming from Descartes’ idea of the separation of the physical brain from “mind”). This is why I mention embodied/enactive processes as a way of understanding how the human brain is extended into the ongoing materially engaged cultural affordances that humans surround themselves with (these affordances are instantiated in the brain especially with regard to mirror neurons, particularly bimodal and canonical neurons, as part of the dorsal and ventro-dorsal stream running through the intraparietal area to the ventral premotor area, with the anterior supramarginal gyrus also implicated). I, however, do not see this as a passive process, as you suggest, but as dynamic, flexible, and malleable, as typified by the capacity of the brain to undergo reformulation, i.e., the ability to override certain long-standing evolutionarily instantiated cognitive modules as a result of cultural input.
With regard to the mind as a “distributed commodity”, I was not implying that artefacts such as tools are passively conceived; in fact, just the opposite, as just outlined. Archaeology together with neuroscience has recently looked in great detail at “tool, technique and application” to assess which parts of the brain are active when a person makes basic stone tools. A particular circuit is implicated (see Shelby Putt 2017, as illustrated in the article at: https://theconversation.com/brain-imaging-modern-people-making-stone-age-tools-hints-at-evolution-of-human-intelligence-77231), which is the same circuit that operates in musicians learning to play the piano!! (Bangert et al 2006. Shared networks for auditory and motor processing in professional pianists: Evidence from fMRI conjunction. Neuroimage. 30 (3): 917-926). These two studies underline the way evolutionarily instantiated domains can be exapted for purposes for which they were not originally “designed”.
Such studies also dovetail well with your comments about the dynamic processes involved in learning to play the violin.
When I refer to reformatting of the brain, this was not meant in the computer sense, but in a more generalised sense where the brain can reconstitute itself in response to feedback from cultural parameters, which can be minor or major depending on circumstances.
You object to my use of the dashed line in the uploaded figure. I have to say that the dashed line does not imply a value judgement downgrading the external feedback mechanism but is merely employed to accentuate the processes that take place exteriorly to the brain. The concept of “disembodied cognitive representations” refers to the fact that some aspects of cognition may well not be completely explained by enactive processes. This is where we come to the complexities of Charles Sanders Peirce’s “representational” semiotic theory, which are much too convoluted to go into here. For those interested, I go into embodied and enactive accounts of human psychology in the following article: Hodgson, D. 2013. Grappling with an enigma: the complexity of human behavior as a multidimensional phenomenon. In, The Psychology of Human Behavior, R. G. Bednarik (ed.). Nova Science Publishers: New York.
Here are one or two passages from that paper alluding to enactive cognition and “disembodied cognitive representations”:
The concept of embodied cognition, however, requires a more nuanced reading, not least in order to avoid the concept becoming redundant. Indeed, there is presently no common acceptance as to what exactly is meant by embodied cognition (Ziemke & Frank 2008; Goschler 2005). The following statement encapsulates the problem.
It is of course possible to treat every kind of behavior as the interaction of a body in an environment. Thus, every experience we make could be called “embodied”. But this would make the notion of the body trivial and we were better off with just calling it “experience” and nothing else. If one doesn’t want the notion of embodiment to be a trivial one, which would lead to a non-falsifiable theory, “body” needs a narrower definition. (Goschler 2005).
In order to address this problem, it is necessary to consider what neuroscience has to say about the brain. Although neuroscience has led to an improved understanding of brain function that has given rise to sub-disciplines such as neuroarchaeology, such insights need to be considered in terms of how the brain of humans has been recalibrated through socio-cultural feedback loops. Such recalibration has allowed the world to be interpreted and reinterpreted thanks to a cognitive capacity that can lead to the countermanding of certain long-standing embodied constraints (Dehaene & Cohen 2007). The interaction between embodied components and more abstract cognitive representations is, however, complex in that, depending on circumstances, embodied factors may preside whereas, in a different setting, more abstract cognitive domains may dominate. This reflects the difference between evolutionary defined functions and socio/cultural cognitive influences. It seems therefore inappropriate to regard human behavior and its effects as determined solely by embodiment, as some archaeologists and anthropologists have tended to (see, for example, Coward & Gamble 2008; Malafouris, 2010a) usually following Gallese (2005, 2007) and associates. Nor should behavior be regarded simply as dependent on higher-order disembodied cognition, as embodied and disembodied systems can function in parallel or separately depending on the prevailing situation (Mahon & Caramazza 2008; Meteyard 2010; Dove 2009).
….
In view of these observations, some researchers propose that representational and embodied approaches should be regarded as complementary rather than opposed (Mahon & Caramazza 2008; Meteyard 2010; Dove 2009). The fully embodied radicalism put forward by authorities, such as Lakoff & Nuñez (1999), has, therefore, been criticized as all cognition is regarded as embodied including language and mathematics. Seen from a broader more complementary perspective, embodied cognition can, however, be viewed as specifically linked to on-line bodily processes i.e., somatosensory and visuospatial/motor abilities and basic interactions of the mirror system with social responses – an approach which has been referred to as “material embodiment” (Ziemke & Frank 2008). In contrast, the off-line conceptual/semantic aspects of higher thinking, visual memory, propositional reasoning, language, and where inferences regarding social information are made over time, seem to have more in common with disembodied cognition. Take, for example, the following statement:
…cognition entails the manipulation of explicit representations of the state and behavior of the external world to facilitate appropriate, adaptive, anticipatory, and effective interaction, and the storage of the knowledge gained from this experience to reason even more effectively in the future. Reasoning itself is symbolic: a procedural process whereby explicit representations of an external world are manipulated to infer likely changes in the configuration of the world arising from causal actions. Vernon (2008).
In this sense, semantics would not be possible in the absence of the capacity for context independent representations and symbolism in some form. Hence, a more interactive process, referred to as secondary/weak embodiment (Meteyard 2010), where elements of representation and embodiment exist alongside one another, seems to provide a more realistic way of understanding the dynamics of cognition. The application of embodiment to all areas of cognition has also been criticized because it is unable to account for many abilities that are better accommodated by disembodied cognition (see Adams & Aizawa 2010; Chatterjee 2010; Mahon & Caramazza 2008; Zlatev 2008; Tversky & Hard 2009). For example, cognition that marks out humans may rely on recently evolved forward areas of the brain that operate as part of a multi-modal representation system for coordinating complex information (Barsalou 2008; Pessullo 2011) – what Halford et al. (2010) refer to as higher cognitive relational knowledge. It has also been established that social attribution recruits the medial prefrontal cortex and is involved in the meta-representation of complex social inferences (Van Overwalle 2009). So, although embodied processes may play an important role in cognition, “amodal” representations continue to be relevant by serving a unitary function (Barsalou 2008). In other words, and as Figure 1 illustrates, embodied attributes may operate at the lower to middle end of a hierarchy from which more abstract and flexible modes are constructed that may sometimes constrain thinking but at other times abstract conceptual domains can lead to a release from such limitations (Goldman & de Vignemont 2009; Ikegami & Zlatev 2007).
Hope some of these comments clarify the various issues raised.
Derek, your response is awesome. I see I have a lot of references to read to catch up to it.
Meanwhile, I respond to Simon Penny's comment on art history and contemporary art. If you were to read my research findings on researchgate and academia.edu, especially the posted slides from my course on 2 million years of art in human evolution (Maine College of Art) you may (or may not) agree that 'design' (a word I now find much better than 'art') or 'palaeoart' (the term used by Robert Bednarik) may well have a much longer history than what is taught in what seems to be every art history department in the US and elsewhere. (I suggest you read up on Bednarik and everyone else who works in the field of rock art studies.) Typically all the faculty and courses focus on Ancient to more recent 'art', and in some overview courses, one class is given to the cave art of Europe, and may be a few minutes on indigenous art in the rest of the world. Such an approach to art history is totally obsolete and exemplifies Euro-centric colonialism (as is late modern capitalism, or at least the term 'late modern capitalism'). That the term 'art' seems to be 'modern' and doesn't apply to paleolithic art is accepted consensus, see e.g., Randall White who has eloquently addressed this. In my MECA course I use the term 'art' because I am engaging artists. They had no problem with the usage. Nor did students focused on art history.
Your concluding remark "making any such assumptions about historically or culturally more different cases is extremely dubious" implies one should not do research on the origins of art or graphic design in human evolution. It might not be so dubious if you entertained the now extensive research on the topic.
Touché, Derek. I am looking at your Figure 1 right now and it seems that you don't really need arrowheads on the left side of the lowest line (labelled Early Evolution on the left side)... because we aren't evolving "backwards"... at least I hope we are not. Please correct me if there is another interpretation I am missing.
Juliet,
The arrowhead you cite refers not to going back in evolutionary time; rather, it refers to the fact that, in contrast to the association cortex, the sensory areas tend to become structured very early on in infancy, e.g., primary visual cortex. These areas are vital for encoding the invariances coming from the environment; they quickly become automatic and rapid and so would have been a common component of the early hominin brain (though there seems to have been a modicum of reorganization of some parts of these areas when modern humans are compared to nonhuman primates).
Derek,
I just now looked at your Figure 1. The evo timeline 'early' to 'recent', with its two descriptive boxes, might be so, but I doubt it. What is the evidence in paleoanthropology for how you assign the descriptors? It seems to me the evidence points to both brain functions emerging together, with sparse data for this in Mode I Oldowan and stronger support in Mode II Acheulian. If there is a difference between 'early' and 'recent' phases of the evolution of 'mind', would it not be 'boxed' as the difference between what Piaget called 'concrete operations' and 'formal operations'? If these emerge sequentially in ontogeny, it seems, both by the parallel hypothesis and some archaeological evidence, that they emerged sequentially in phylogeny?
Peter, I am not into the cognitivist debate, I come from a different philosophical perspective, but thanks for your reply. I agree with much of what you say in your reply. It propelled me to look up 4E in the Stanford Encyclopedia of Philosophy and Wikipedia. As a layperson on the margins I find each of the 4Es acceptable. No problem with any of them. But as an outsider I do venture a very tentative comment from the perspective of phenomenology and ontology: the various positions in the cognitivist debate appear to me to be hindered by some substantial category mistakes, confusions of terminology between different ontological regions. In my project on ontology in progress, I differentiate six distinct ontological regions (regions of 'what is', ta onta) as well as two 'pre-ontological regions'. Each, I suggest, correlates to a distinctively different neural substrate. I call them 'theory of mind' (ToM, sensu lato) networks. IMHO, the various cognitive models do not differentiate these regions sufficiently. Attached is a table summarizing my differentiations and correlations. The latter are tentative. My quick read of summaries of 4E is that authors conflate radically different concepts of 'agency' (principle of scientific causality), self, life and ego-cognizer from four different ontological realms. A cognitivist may respond that they are using these terms loosely or in an everyday ordinary-language manner. I see that as confirmation of the challenge.
Juliet, Simon, and James,
In order to further clarify the model outlined in the diagram, Singer (2006) found that, with regard to the emotional brain (processed in limbic and para-limbic structures) and the social brain (processed in pre-frontal [mPFC] and posterior STS and associated areas), the former develops early in phylogeny whereas the latter develops later in phylogeny. Thus, understanding the mental states of others develops later in ontogeny than the ability to share emotions. This is due to the fact that sharing affective states is dependent on phylogenetically older areas that develop early in ontogeny whereas, in contrast, understanding the beliefs and thoughts of others depends on brain regions that evolved later in phylogeny and mature the latest in ontogeny. Correspondingly, higher-order association areas have been found to mature only after lower-order sensory areas have matured. In effect, phylogenetically older areas appear to mature before newer cortical areas. For example, in the frontal cortex, primary sensorimotor cortices mature first but the pre-frontal area matures much later.
Singer, T. 2006. The neuronal basis and ontogeny of empathy and mind reading: Review of literature and implications for future research. Neuroscience & Biobehavioral Reviews. 30 (6): 855-863.
Re: Singer. I finally read this. It is a 2006 paper, from when mirror neurons and cognitive theory of mind were generalized in many research papers covering a multitude of topics. (1) Frith proposed that cognitive theory of mind arose out of the neural network for biomotion/animacy. That was a generalization, which now seems false. The latest research seems to say that there is a set of different ('dissociated') neural networks for social cognition, theory of mind, affective empathy, moral sensitivity, biomotion/animacy, agency, etc. They can no longer be lumped together in total nor in part. If there is one node in each of these different networks that overlaps in a particular task, this requires specific explanation.
(2) The mirror neuron motor hypothesis for 'explaining' language is not fully confirmed in several imaging studies; e.g., see papers by Grodzinsky and Santi ('battle over Broca's area', etc.), Mesgarani et al validating the Jakobson and Trubetzkoy phonology model using distinctive features, Ding et al 2016 validating Chomsky, and Santi 2007 validating Chomsky. Deep structure, a term out of fashion for a couple of decades, is now back as the pendulum swings in its favor.
(3) For an update on the empathy network, I recommend the review by Gonzalez-Liencres, Shamay-Tsoory and Brüne 2013. This study also indicates some problems with the mirror neuron hypothesis. I am working on a paper on the animacy network, and the most recent imaging studies argue that this network is distinct from the other networks I note in (1).
An engaging discussion.
To my mind we might benefit from viewing modernity as a continuum rather than an absolute. We are not the fastest, the biggest or the most robust, but our modernity (as a "new" species) equipped us with much to enable elite hunting and defence.
How might such a continuum of modernity look? (as identified above) Humans, the "new" species, have the increased cranial capacity that, over time, overcomes our inherent lack of speed, small teeth, poor climbing and jumping skills, etc. Our modernity would appear inherent with the emergence of the very first human. How we assess or see this is the problem to be overcome, just as we have developed labels for and come to assess / see (for instance) pre-Platonism, the pre-Neolithic, the pre-industrial revolution, etc.
The question might be: is it possible a non-modern human has ever existed? Does the lack of Copernican or Galilean ways of seeing / comprehending in pre-Copernican or pre-Galilean societies make these cultures any less modern? If so, how? These questions seem as pertinent to early human prehistory as to discussions of modern philosophy / natural science.
Consequently, the question might be: when / how did humans begin to see themselves as something other than (simply) another animal? This Gilgamesh dichotomy likely existed far beyond Sumerian times: the sense of something lost in becoming modern; the need to offer thanks to that which provides food / water.
Complexity, even if the mechanism is there, takes time to accumulate. At the same time, judging complexity by the preservation of cultural remains is, as we all know, difficult.
To David Burke,
Indeed, modernity is a relative term that depends on from where one views the issue. Perhaps the term should be dropped, as it comes with too much conceptually loaded baggage. Nevertheless, it has to be said there are differences, especially in relation to the effects of cumulative culture. Richerson and Boyd and collaborators have shown that so-called "modern" humans, because they are over-dependent on specialists and complex cumulative technology, are unable to survive when, for one reason or another, they are obliged to live in a "naturally" disposed environment. In contrast, in such an environment, hunter-gatherers have the benefit of particular technologies and a culture that is adapted to that environment and are able to prosper.
"Modern humans" is used in three different senses to my knowledge. Paleoanthropologists speak of Anatomical Moderns for the species or subspecies that evolved in Africa about 200 kya. The early AMH have modern looking skulls and the post-crania are less robust than earlier Archaic Humans, tho they are still robust compared to later H. sapiens.
Anthropologists talk about Behavioral Moderns, exemplified by the Upper Paleolithic folks of W Eurasia after 40 kya. These are people with art and fancy stone tools who seem to have been able to do anything that ethnographic hunter-gatherers can do. Renfrew's sapient paradox calls attention to the apparent gap between the first AMH and the evolution of BMH. There are hints of behavioral modernity in the African record in the gap, but the whole package represented by the UP has not turned up so far. But the African record is still ill explored.
Finally, sociologists speak of Modern Societies as those that evolved (and are still evolving) in the wake of the industrial revolution that began a couple or three centuries ago.
As with any periodization, it is a crude descriptive tool. Evolutionary processes are generally incremental especially if you have highly time-resolved data. On the other hand, rates of evolution can be quite fast so if time resolution is poor the record will look punctuational. Evolutionary biologists try to find cases where they can get high resolution data to test their theories. See the Ezard ref below.
Ezard, T. H. G., Aze, T., Pearson, P. N., & Purvis, A. (2011). Interplay Between Changing Climate and Species’ Ecology Drives Macroevolutionary Dynamics. Science, 332(6027), 349-351. doi:10.1126/science.120306
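The point about time resolution can be illustrated with a few lines of toy arithmetic; the numbers below are invented for demonstration and are not drawn from Ezard et al. The same brief episode of directional change looks gradual or punctuational depending only on how often the record is sampled.

```python
# Purely illustrative sketch of the resolution point: a trait changes
# quickly during one brief episode (generations 100-119) and is otherwise
# static. The underlying record is identical in both cases below.
trait, value = [], 0.0
for t in range(300):
    if 100 <= t < 120:          # brief episode of directional change
        value += 0.5
    trait.append(value)

for step in (5, 100):
    samples = trait[::step]
    deltas = [b - a for a, b in zip(samples, samples[1:])]
    nonzero = [round(d, 1) for d in deltas if d > 0]
    print(f"sampling every {step:>3} generations -> apparent changes: {nonzero}")
# Sampled every 5 generations, the change appears as several small
# increments (gradual); sampled every 100, it collapses into what looks
# like a single large jump (punctuational).
```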
Hi Derek,
I am still not persuaded by the 4E reference. You say:
" We might make some progress here by utilising the concept of the extended mind/brain whereby, through embodied processes, the mind, through social engagement in humanly made material objects, became a distributed commodity. By this I mean that materials became a shared human commodity available to the community as a whole. "
I continue to find the language obscuring the reality. The mind is not being distributed; the objects are. Yes, the objects are a cognitive resource and you are right that they can be shared in interesting ways (i.e., you don't have to meet someone, you can leave the object for them), but they are not minds.
Peter
Hi Derek,
Complex cooperative behaviour seems a defining characteristic of humanity from the earliest humans.
Perhaps names and labels for an Archaeology of Cooperative Behaviour in early humans might allow us to see the data / problem in new and informative ways.
What would such an archaeology look like? What terminology would develop? Does it let us see the data differently? I think it does.
Making and ownership of tools requires cooperation, if jealousy and greed are to be thwarted.
I agree with much of your thesis, but think we need to observe the data from new / different angles to answer the questions you ask.
The premise of cooperative behaviour(s) may be one useful example...
To Peter Hiscock and Peter Richerson,
I agree that artefacts are in themselves not “minds” but note that we allude to “extended minds”, which implies that something about the mind is imbued in cultural objects. Thus, objects are instilled with affordances as a function of canonical neurones that are found in the visuo-motor dorsal stream. Similarly, mirror neurones in the human brain seem to underlie the affordances that facilitate imitation. Such neural circuits may well drive the 4Es (extended, embodied, embedded, enactive cognition). However, as a soft rather than a hard embodied theorist (see my comment and diagram on an earlier part of this thread), I would subscribe to the idea that 4E cognitive processes can act in concert, alongside, or even become somewhat detached from embodied components. Such detachment may be underwritten by the semantic aspects of cognition found in the ventral stream and language systems where, according to circumstances, they can be directed by embodied systems, or alternatively act to guide enactive processes, or, in some circumstances, function as relatively independent entities. The concept of cognitive “bandwidth” in your Biological Theory paper is pertinent here, as it chimes with my 2013 Biological Theory paper where I mention the importance and degree of “flexibility” in human cognition: some types of cultural information take time to accumulate and, for that very reason, can easily be lost should population levels decrease or environmental/ecological factors change. Cecilia Heyes and Caroline Catmur’s research on cognition is particularly relevant to this debate. This may, to some extent, account for the fluctuating archaeological signal from 300,000 years ago onwards. It seems that the anatomically modern human brain may always have had the potential for complex material culture, but this was unable to accumulate until the population had reached a more sustainable and stable level beginning around 42,000 years ago. Renfrew’s sapient paradox may therefore not be as paradoxical as it seems!
Peter Richerson’s point regarding later humans being less robust than earlier forms may be a result of the growing influence of cumulative culture, which increasingly served as a buffer to “raw” evolutionary forces (Hodgson 2000). Thus, humans from 300,000 years ago steadily domesticated themselves, a process that became even more pronounced with the appearance of complex cultural artefacts from 42,000 years ago onwards, when overall brain size continued to decrease. The way that cumulative culture is fed back through niche construction to restructure the human brain seems to be crucial in determining the behavioural profile at any one time.
Catmur C, Walsh V, Heyes CM (2007) Sensorimotor learning configures the human mirror system. Curr Biol 17:1527–1531
Catmur C, Walsh V, Heyes CM (2009) Associative sequence learning: the role of experience in the development of imitation and the mirror system. Philos T R Soc B 364:2369–2380
Heyes CM (2003) Four routes of cognitive evolution. Psychol Rev 110:713–727
Heyes CM (2009) Evolution, development and intentional control of imitation. Philos T R Soc B 364:2293–2298
Heyes C (2010) Where do mirror neurons come from? Neurosci Biobehav R 3:575–583
Heyes CM (2011a) What’s social about social learning? J Comp Psychol. doi:10.1037/a0025180
Heyes CM (2011b) Cultural inheritance of cultural learning. In: Workshop presented at University of Oxford. New thinking: advances in the study of human cognitive evolution. http://media.podcasts.ox.ac.uk/socanth/ehc2011/ehc00_intro.mp3
Hodgson. D. (2000). Art, Perception and Information Processing: An Evolutionary Perspective. Rock Art Research. 17 (1): 3-34. http://www.ifrao.com/art-and-perception/
Hodgson, D. (2013). Cognitive Evolution, Population, Transmission, and Material Culture. Biological Theory. 7: 237-246. http://link.springer.com/article/10.1007/s13752-012-0074-y
It may be the case that niche construction is a component of neo-Darwinian evolution. But doesn't every biological species or quasispecies have a niche? If so, how do we specify the specific niches for the dozen-plus primate species, and how each niche was part of the extended cognitive apparatus? For example, what is the chimpanzee niche? The resource landscape? A bed nest built high in a tree? The same goes for the common ancestor of humans and chimpanzees. Talk of niches without detailed specification of a hypothesis for a species' niche is untestable speculation.
Re: language and mirror neurons. I think the latest research by Grodzinsky et al is convincing that mirror neurons cannot explain the origin of language or the structure of language. Mirror neurons may explain imitation, but not the cultural behavior or production that is imitated?
To James Harrod,
The difference between humans and other biological species seems to be that the niche constructed by the former, as opposed to the latter, is cumulative across generations.
Renfrew’s sapient paradox is a misperception resulting from (1) the assumption that what makes H. sapiens human is “sapience” and that this is the main reason for our large brains, (2) ignorance of anthropology and (3) selective inattention to archaeology. The Neolithic revolution – the focus of Renfrew’s paradox – was not so much a “cognitive accomplishment” as a political imposition. No one in their right mind would switch from foraging to agriculture unless someone pointed a metaphorical gun at their head. Even today, foragers, having seen the blessings of civilization, prefer to carry on hunting and gathering. They enjoy many more hours of leisure than we do, and in any case foraging is not “work” – all motile animals forage instinctively and civilized people hunt and gather just for the fun of it. The foraging lifestyle – the one to which our evolution has best adapted us – is probably the happiest.
The archaeological evidence shows that, for most people, the Neolithic revolution severely eroded the quality of life. Bodies got smaller due to malnutrition, and common health problems included vitamin deficiency diseases such as rickets, repetitive strain injuries due to the tedious work of grinding cereals, and dental damage caused by the resulting grit in food. Someone must have benefited, and this implies the collapse of any prior egalitarian social order and a hierarchic system allowing a privileged elite to exploit the labour of others. A parallel could be drawn with the industrial revolution which was equally political in origin. Without a population disenfranchised by land clearance, there would have been no “labour force” to operate the machines. For most people, the movement from farm to factory was no more voluntary than that from foraging to farming.
Note that, following the agricultural revolution, brains as well as bodies got smaller. There were similar anatomical changes in foragers as agriculturalists drove them into the most marginal environments. Even so, foragers today have higher brain to body mass ratios than post-agricultural people. Perhaps social hierarchy – with control imposed from the “outside” – reduces selection pressure for large expensive brains to maintain human cooperation and social cohesion.
Renfrew’s cognitive bias is apparent in the way he homes in on the Neolithic as the watershed transition of anatomically to behaviourally modern humans. The glorious flowering of Upper Palaeolithic art – including supernatural entities such as the lion-headed man from Hohlenstein-Stadel, other objects of ceremonial or ritual significance, and some of the earliest musical instruments – does not seem to strike him as sufficiently “sapient”. Upper Palaeolithic art persisted from around 40,000 to 12,000 years ago, ending with the final phase of the last glacial period. That leaves a window of around 2000 years in which climate change presumably favoured the social changes required for the agricultural revolution.
Behavioural modernity is undoubtedly older than the Upper Palaeolithic. Sheila Coulson has discovered evidence of python ritual in Rhino Cave, Botswana, more than 70 thousand years ago. Ian Watts finds evidence of ritualized use of ochre around half a million years old, and of ritual becoming an evolutionarily stable strategy between 150 and 200 thousand years ago. The Berekhat Ram “Venus” – which I think is more likely to be a child’s toy than a “work of art” – is at least 230 thousand years old.
In 2009, I reviewed The Sapient Mind, edited by Colin Renfrew, Chris Frith, and Lambros Malafouris – the outcome of a 2007 symposium in which Renfrew proposed his paradox. The symposium (and book) aimed to further our understanding of the origins of the modern human mind. But, as the editors note in their introduction to the book, it is “extremely difficult” to specify exactly what it is that makes the human mind unique. So these authors aimed to explain something even though they didn’t know what it was.
Curiously, in the same year, I co-authored a paper with Chris Frith et al. on the neural correlates of observing pretend play (Whitehead, Marchant, Craik & Frith, 2009, SCAN, 4, 369–78). We published in SCAN because Chris did not think it of sufficient theoretical interest for Nature. I am sure the editors of that prestigious journal would have agreed. This strikes to the heart of what is wrong with cognitive science. Play has deep biological roots. Many mammals engage in embodied social play including rats, social carnivores, and primates. Evidence for pretend play in chimps is scant, but from around 12 months human children spend much of their day pretending. This doesn’t even stop for meals – hence the popularity of jelly babies, gingerbread men, animal crackers, etc. Using toys to represent real things paves the way for role-play – using yourself to represent someone or something else – around the time of the “terrible twos”. The child is becoming more self- and other-aware. Imaging studies show that these activities recruit brain structures that were most expanded during human evolution, including those implicated in “theory of mind” and the so-called “default network”. There can be little doubt that these costly and sophisticated human adaptations are important for the development of the “modern human mind”.
When I first proposed an fMRI study of pretend play to Chris, in July 2005, I pointed out to him that the brain must be a doing organ before it can be a thinking organ. This is true phylogenetically (“thought” could not evolve in an animal that could not act) and ontogenetically (the foetal brain puts out efferents to muscles before it receives afferents from sense organs). Chris agreed. Eleven years later (2016) he co-authored with Thomas Metzinger a chapter in The Pragmatic Turn, a book exploring the view that cognition does not produce models of the world but rather subserves action. This is hardly the paradigm shift claimed by the editors. Similar views were expressed in the late 19th century by the American pragmatists Charles Peirce, William James, and John Dewey (reviewed in another book with the same title: The Pragmatic Turn by Richard J. Bernstein, 2010).
What particularly strikes me about this latest action-oriented book is that, even in the chapters on cognitive development, the authors do not address the real actions performed by real people in the real world – notably the things that children do spontaneously whilst growing up, when the self-sculpting brain is developing its adult structure and functionality. Rather, the research focus is on laboratory tasks devised to test “cognitive functions”. They do not seem interested in the function of contingent mirror play, which can begin within minutes of birth, or song-and-dance displays at three months, indexical gestures at six months, mark-making behaviour at nine months, iconic gesture-calls and pretend play at twelve months, and so on. They are interested in language (because it’s “sapient”) but not in mimesis, music, and the formidable armamentarium of signals and displays that distinguish our species and define us as human. Authors instead describe infant movement as “chaotic” and focus on motor control, grasping, efference copy, and “sensorimotor contingencies”. They assume that action is primarily goal directed. There is not one reference to childhood play, which has function but no goal – being autotelic or self-motivated, and pursued “just for fun”.
At least The Pragmatic Turn is a healthy step beyond the input-processing-output model, first criticised as incomplete by John Dewey in 1896. Cognitive science has advanced from the computer model to a robotics model. I am not denying that cognitive science has made major discoveries and advances, but as science it is severely blinkered. Cognitivism, I believe, is linked to the physicalist ideology that began with the Enlightenment. There are some (like Metzinger) who seem to celebrate a perverse aesthetic that regards anything that dehumanizes humans as “scientific”.
Relevant publications can be downloaded at www.socialmirrors.org. Go to “About Charles Whitehead”.
James Harrod - when I get a breather I'll look at your work - looks very interesting.
Dear Derek and all
I apologise for my absence since my first and last post 'a month ago'. I subsequently had extended difficulties with RG display in general, which among other things prevented me from getting back on this thread.
FYI, it appears that the current version of Safari does not display RG accurately. That or my adblockers ghostery and ublock were getting in the way. In either case, RG tech support were not helpful.
I was also in the run-up to An Ocean of Knowledge: Pacific Seafaring Traditions, Sustainability and Cultural Survival, the conference which I ran last week (see poster attached) - it was a huge success.
I'll respond to a couple of recent posts in the following postings.
SP.
@ Derek (a month ago)
> There is an ongoing debate here regarding whether the onus is on the individual or social in human evolution. In truth, it may be both.
I think without question it cannot be other than 'both'. I think the Anglo-American humanistic overemphasis on the individual clouds our vision. Perhaps Descartes could only say 'I' think therefore 'I' am because, till then, we'd said "we think because we are"?
> ... archaeology is gradually distancing itself from the idea that the brain can be seen as similar to a computer with the hardware as the neural circuits and the software being the culture slotted into these circuits that leads to the ability to employ “symbols”
TGFT (thank god for that), it's about time!
> This is why I mention embodied/enactive processes as a way of understanding how the human brain is extended into the ongoing materially engaged cultural affordances that humans surround themselves with
I too endorse such paradigms. I ran a conference on the subject last year (A Body of Knowledge - Embodied Cognition and the Arts - sites.uci.edu/bok2016) and my forthcoming book 'Making Sense' (MIT press) draws heavily on those paradigms.
> these affordances are instantiated in the brain
Here I have to quibble with you (as I quibble with Dennett in his use of the term in his new book From Bacteria to Bach). I'm not sure it's valid to speak of affordances being instantiated in the brain. According to Gibson, a 'hard externalist', affordances are in the world. I respect Gibson's position, as difficult as it is. As I understand it, if we speak of affordances as being *in the brain*, we are speaking of something quite different from Gibson's idea.
> Archaeology together with neuroscience has recently looked in great detail at “tool, technique and application” to assess which parts of the brain are active when a person makes basic stone tools. A particular circuit is implicated ...which is the same circuit that operates in musicians learning to play the piano!!
I'm not surprised in the slightest. But I do quibble with the idea of it being just one 'circuit'. And I quibble with the use of the word *circuit* in neuroscience - it's just an earlier generation of metaphors borrowed from electrical engineering. Metaphors are great, until they're not.
> These two studies underline the way evolutionarily instantiated domains can be exapted for purposes for which they were not originally “designed”.
Vittorio Gallese has a term for this (neural exploitation), and Elizabeth Bates famously said that language is a new machine made of old parts.
> The concept of “disembodied cognitive representations” refers to the fact that some aspects of cognition may well not be completely explained by enactive processes.
"may well not be completely explained" sound like a bit of a hedge. And I agree, once we embrace the possibility of externalisms of one kind or another - enactivism vs extended mind for instance - we have lots of new questions to answer. In the past, almost all cognitive explanations depended on the idea of mental representation. Now it seems an awful lot of cognition occurs without mental representation, or using transitory and contingent representations derived from immediate manipulation of artifacts.
> But this would make the notion of the body trivial and we were better off with just calling it “experience”
This assumes we know what we mean when we use the term 'experience'. This is open to question, as Maxine Sheets-Johnstone has emphasised.
In general I agree with the 'middle way' you propose in the bolded passage, but I violently disagree with the characterisation "off-line conceptual/semantic aspects of higher thinking": both 'off-line' and 'higher thinking' subscribe to the traditional hierarchical dualisms _as if they were real_.
> Reasoning itself is symbolic
This is clearly not questionable. The question is "how much of cognition is reasoning?" And secondly, "what do we mean by reasoning?" Is a cognitive performance which is non-conscious, but is explicable in terms of reasoning, actually reasoning?
> embodied attributes may operate at the lower to middle end of a hierarchy
Again, I challenge the hierarchical dualism implicit in this characterisation. On what basis do you feel permitted to assign 'embodied' aspects of cognition lower status? I do not necessarily want to invert the hierarchy, but I do want to know a principled basis for retaining it.
sincerely
SP
@ Peter Hiscock
> Yes, the objects are a cognitive resource and you are right that they can be shared in interesting ways (i.e., you don't have to meet someone, you can leave the object for them), but they are not minds.
So what is 'mind'? And is mind exclusively the property of an individual brain in an individual body? We (white western male intellectuals) are trained to think in these terms, but can we think outside them? Why do we cling to this hallowed notion of the all-surpassing individual? In some societies, the individual only exists as a relational position within a group (family/kin/village/state). In the west, we think of society as constituted by an agglomeration of individuals.
@ James Harrod
> Such an approach to art history is totally obsolete and exemplifies Euro-centric colonialism
I couldn't agree more.
> Your concluding remark "making any such assumptions about historically or culturally more different cases is extremely dubious" implies one should not do research on the origins of art or graphic design in human evolution. It might not be so dubious if you entertained the now extensive research on the topic.
Now now, ad hominem attacks are not appropriate. I think you missed my point, or I did not make it clear; misunderstandings do occur in this narrow bandwidth. My point concerned not 'pictures' but what we call 'art', which is often not pictures. I have been an artist for over 40 years and little of my production has been pictures. The term 'art' to me implies a set of conventions and social networks, and this varies radically across historical periods and cultures. Yes, people have made marks, pictures, representations for as long as we've been sapient - in fact I'd say making pictures defines sapience. But I would not confuse the study of the history of "marks, pictures, representations" by deploying the word 'art'.
sincerely
SP
@ Charles Whitehead
thanks for your post, very interesting
> No one in their right mind would switch from foraging to agriculture unless someone pointed a metaphorical gun at their head.
nice. I like this argument, which I first encountered in John Zerzan (1994).
https://theanarchistlibrary.org/library/john-zerzan-future-primitive.lt.pdf
> in July 2005, I pointed out to him that the brain must be a doing organ before it can be a thinking organ.
right. It is a commonplace in ethological circles. Patently obvious - evolutionarily - the body grew a brain, not the other way around. The body grew a brain to cope with mobility and sensing at a distance.
> Many mammals engage in embodied social play including rats, social carnivores, and primates. Evidence for pretend play in chimps is scant, but from around 12 months human children spend much of their day pretending.
I think pretend play is more broadly present. When my cat bites in play, it is a restrained bite. She's saying, "I'm pretending to bite you." Cats have irony.
> There is not one reference to childhood play, which has function but no goal – being autotelic or self-motivated, and pursued “just for fun”.
The whole notion that cognition is instrumental is so simplistic. But play clearly has a 'function'.
> I am not denying that cognitive science has made major discoveries and advances, but as science it is severely blinkered.
yup
(1) Simon Penny et al., I regret that my comment about 'entertaining the now extensive topic' of 'art' (graphic design, palaeoart, curation of pareidolia) from 2-3 million years ago came across as an ad hominem argument. I guess it is my frustration at how much evidence exists and how little seems to be referenced in humanities and science research on cultural evolution. I too have engaged in creative writing, stone sculpture, and watercolors, and am now into acrylic painting. Basically, given my elder bias, my meager developing style humbly attempts a playful response to Kandinsky, John Marin, and the abstract expressionists.
(2) Since the term 'art' is Eurocentric, my published and online article on evidence for 'art' and 'religious/spiritual behavior' in the Oldowan provides a list of behaviors to be used as criteria. It synthesizes lists from McBrearty and Brooks on 'symbolic behavior' (2000); Bednarik (2013, 'Pleistocene Palaeoart of Africa'); Bednarik (2013, 'Pleistocene Palaeoart of Asia'); Bednarik (2003, 'Earliest Evidence of Palaeoart'); Bednarik (1995, 'Concept-Mediated Marking'); the IFRAO Rock Art Glossary (Bednarik 2003/2013); and 'trace-making' (Matthews 2011, 1997, 1994).
(3) Re: frontal lobe and cultural evolution. I compared the neuroimaging studies by Stout et al. on Oldowan and Acheulian toolmaking with the rare studies on art-making (Solso's preliminary studies, which I reanalyzed using the BrainVoyager Brain Tutor to better identify neural coordinates, plus a couple of studies on the actual making of drawings and portraits and on Kanji writing), aiming to synthesize at least one study that might be relevant to 'glyph-making' and 'petroglyph-making'. Acknowledging the very limited number of neuroscience studies of art-making, in contrast to the accumulating studies (and books) on art appreciation, which activates an overlapping but distinct neural network from art-making, I suggest the evidence indicates that art-making activates frontal and frontopolar loci, and far more loci across the frontal-prefrontal-Broca-inferior-parietal-temporal-occipital network, than does Oldowan and Acheulian toolmaking, which predominantly activates, as one might expect, premotor, motor, superior parietal, and occipital cortex loci. Thus, if frontal lobe expansion/reorganization is a factor in cultural evolution, art, or whatever label one wishes to give it, appears to have played a critical role. See my IFRAO Albuquerque presentation (slides posted), where I map activation loci onto the Tobias illustration of brain expansion areas from the Oldowan onward.
(4) Re behavioral 'flexibility' as a defining characteristic of H. sapiens sapiens versus Neanderthals or other contemporary species/subspecies: see Harold Dibble's counter-argument (for instance in Beebe Bahrami, Café Neandertal, 2017) that, at least in Europe, Neanderthals had a more 'flexible' response to their environment (with the possible exception of fire-making) than H. sapiens sapiens.
(5) Re which species or quasi-species makes 'art'?
I reviewed 750 Paleolithic archaeological sites across the globe, finding 318 reports of symbolic behavior prior to the Upper Paleolithic/Later Stone Age (Table 1a). Though the paper could be revised in light of more recent discoveries, a totally unexpected finding was that, of the 90 reports of evidence for symbolic behavior during the Late Middle Paleolithic/Middle Stone Age (the latter in Africa), 63 were for Neanderthals and only 27 for sapiens sapiens (Table 1b). (Harrod, J. 2010. Four Memes in the Two Million Year Evolution of Symbol, Metaphor and Myth, conference paper, 4th Annual International Association for Comparative Mythology, Harvard University, October 6-8, 2010. https://www.academia.edu/10601547)
It would be a stretch, but for fun and irony with respect to the current archaeological and cultural evo consensus, one might hypothesize that the evidence for symbolic behavior (palaeoart) of H. sapiens sapiens arriving in Europe during the Late MP comprised mostly pigments and adornments (beads, pendants, etc.), while Neanderthal symbolic behavior included a wider array of symbolic behaviors, including mortuary ritual, stone arrangements, representational imagery (especially stone sculptures or 'figure-stones'), as well as pigments and adornments.
To this could be added more recent Neanderthal evidence for non-utilitarian use of raptor feathers and talons. Recent publications indicating that UP microlithic technologies did not diffuse out of Africa but arose independently and multi-regionally across Africa, Asia, Europe, and Australia also call into question the old LSA/UP out-of-Africa (45 kya) hypothesis. Furthermore, given such a state of affairs and the lack of hominin fossil evidence, this brings into question, as Bednarik has argued, whether some portion of Early Upper Paleolithic Proto-Aurignacian and Aurignacian cave and portable art was produced by Neanderthals.
@ James Harrod
> Simon Penny et al, I regret that my comment about 'entertaining the now extensive topic' of 'art' (graphic design, palaeoart, curation of pareidolia) from 2-3 million years ago' came across as an ad hominem argument. I guess it is my frustration at how much evidence exists and how little seems to be referenced in humanities and science research on cultural evolution.
explanation of misunderstanding accepted. :)
BTW - did you see the news on the recent Chinese discoveries:
http://www.sciencemag.org/news/2017/03/ancient-skulls-may-belong-elusive-humans-called-denisovans
Yes, I read Li et al. (2017), and it does add to our knowledge of hominins during the East Asian Middle Paleolithic, all of whose fossils, it seems, have mosaic features, variously labeled 'H. sapiens sapiens with robust features', 'with archaic features', or 'archaic Homo sapiens': e.g., Xinglongdong 120-150 ka; Xuchang 105-125 ka; Zhirendong 100-113 ka; Fuyan 80-120 ka; Huanglong 81-101 ka; and Tianyuan 40 ka, with B-mtDNA typical of present-day SE Asia (Fu et al. 2013), and Neanderthal but no Denisovan ancestry (Yang et al. 2017). I look forward to genomic analyses of the other MP fossils.
From my palaeoart perspective, I note that the Xinglongdong site, with a south China pebble industry, yielded two Stegodon tusks from two different individuals, placed parallel and incised with straight and curved lines in groups - simple and abstract images (Gao, Huang et al. 2004). This appears significantly older than incised artifacts from the southern African MSA. So far I have seen no reports of palaeoart at the other MP sites listed above.
Aside: with respect to the Tianyuan genetics, this supports my hypothetical map for MIS4 R-mtDNA ~74 ka out of SW Asia crossing Asia by the 'Northern Route' along the Silk Road, branching ~71 ka (R30-31) into South Asia, later branching ~56 ka (B) into East Asia and SE Asia, and ~64 ka (P-mtDNA) into Australia (Harrod 2014: 70, Fig. 3). See my article 'The 200,000-Year Evolution of Homo sapiens sapiens Language ...'.
What if we miss some important actors and civilizations? Could it be that such actors, not perceived by us, exercised important influence on organic, bioenergetic and even cognitive evolution? Or is this just a silly, unproven assumption?
@ Simon Penny
Sorry for delay in responding. Many thanks for your positive comments and the link to the Zerzan paper which I read with interest. Yes, he says with much richer detail and comprehensive references what I mentioned (based on others' research). My only criticism would be that he has something of an ideological objection to culture generally, including ritual. But for ritual, human societies would be more like ape societies - all human societies have been shaped by a cultural revolution which is anti-nature in some sense. Without their healing dance ritual, Bushmen would not be able to meet god face to face once or twice a week (Katz: Boiling Energy). Durkheim in 1912 argued that you cannot have language prior to ritual - and I have never come across a counter-argument, plausible or otherwise. Whereas animal signals refer to perceptible things in the here-and-now (he argued), language refers to thoughts, memories, and things not here-and-now. How can you encrypt an intangible, he asked, unless you have some kind of conventionalised performance or role-play, where everyone has the same ideas or understandings that can then be referred to, so kick-starting language? This has more recently evolved into ritual-speech co-evolution theory:
http://www.chrisknight.co.uk/ritual-speech_coevolution/
Also “Why Humans and Not Apes: the social preconditions for the emergence of language” in The Social Origins of Language (download at www.socialmirrors.org home page top left).
Speech Act theory points to the same conclusion: you can't have language without a "social contract" with reasonable guarantees of honesty, and this requires communal or "sacred" authority established through ritual.
So, though I think foraging may be the happiest way of life, Zerzan didn't quite convince me to haul down the flag of civilization just yet. For one thing, foraging requires a lot of land and we have too many people now. But I suspect there are telic reasons for cultural evolution - we have insights and awarenesses that foragers don't (and to some extent vice versa). I think it's going somewhere and am more optimistic than Zerzan.
About your cat - I confess I hesitated a long time about play-fighting in social carnivores. Is this make-believe, or not? I finally decided it is different from human pretend play. Animals just play-fight. I don't think they have de-coupled ideas about play-fighting representing real fighting, as when children play cops and robbers, pretending to be what they are not. Would an animal straddle a broom and pretend it is riding a horse? Would a child jump out of a high window, pretending she can fly? Or swallow stones, pretending they are sweets? There are two distinct perceptions here - one real, the other imagined. Some call this "symbolic" play and talk about "symbolic culture" and archaeological evidence for "symbolic behaviour", but I think "symbolism" is a chimerical and inadequate concept. For example, "symbolic culture" regularly comprises a vast range of social displays including song (wordless among e.g. Bushmen and "pygmies"), music, and dance. Whilst such displays can assume representational features (as in ballet and Maori hakas), that is secondary - they are not essentially "symbolic".
A corollary of social mirror theory is the "play and display" hypothesis proposing that social displays were a major driving force in human encephalization and the evolution of self- and other-awareness.
Thanks again for your comments.
Perhaps instead of Renfrew's sapient paradox we would be better placed investigating the "Homo sapiens paradox": namely, the range and variety of behaviours that characterise Homo sapiens since their probable inception 300,000 years ago. Some of these behaviours seem to overlap with those of Neanderthals whereas others don't, but where the line is to be drawn is still an ongoing issue.
Sorry, I missed your post. However, I think you are right. The problem concerns not only the variety of behaviors people are capable of but how they change and why. Why do we find stability (which the West condemns as evidence of inferiority - why not superiority? Is it not evidence of sustainability?)? I wrote a whole book on the dilemma anthropologists face regarding this, especially now that many work for tech firms where change is everything.
It reminds me of the controversy over Malinowski's book on culture change, edited by Kaberry. Gluckman found it complete nonsense, in essence a capitulation to colonialism, and it was condemned also by Harris in his Rise of Anthropological Theory. But both forgot how Nadel and Rivers found that attempts to isolate indigenous people from European exploitation and control were futile. Harris saw Malinowski opting for a theory of no-change in his book, but, as for Nadel, the real issue was the difference in power. Today some scholars argue that the results of Nadel's efforts to isolate the Dinka and Nuer only limited infrastructure development and made them less able to resist Arab suppression. Endogenous versus exogenous change is the point, and the one Malinowski was trying to make. The issue was that Nadel had no power over what the colonial Sudan administration was going to do or promote. The papers of the colonial agencies and missionaries are rife with ideas for development, but always controlled by others.
The same is true with experiments like the Vicos Project in Peru, where a hacienda was sold to the indigenous peasants to run. Surrounding hacendados saw it as a threat to their control of their own native serfs, and the project generated considerable tension, conflict, and debate in aid circles over its appropriateness. The issue really was power: power was given to the natives, and in a society whose hierarchical structure is based on the distribution of power and therefore wealth, the project was by that measure naive. Still, it accomplished a lot, though assessments focused on factors absurd for the context, like health (https://pdf.usaid.gov/pdf_docs/PNAAJ616.pdf).
My new book (Big Brains and the Human Superorganism) abandons the anthropocentrism we usually apply to our view of ourselves, and attempts to set human behavior in the context of other animal societies and the nature of complexity.
In the original question I posed at the beginning of this thread, I suggested that neuro-cognition continues to be important to understanding human behavioural evolution, despite the new insights coming from demographic effects. A recent paper (Neubauer, Hublin and Gunz, 2018, 'The evolution of modern human brain shape', Science Advances 4(1): eaao5961, DOI: 10.1126/sciadv.aao5961) seems to support this proposition. It shows that Homo sapiens brain shape changed gradually from the species' inception 300,000 years ago, becoming progressively more globular. This change seems to be mirrored in the archaeological record of the Middle Stone Age, the transition from the Middle Stone Age to the Late Stone Age, and the transition from the Middle Palaeolithic to the Upper Palaeolithic. As the authors state, the "human revolution" just marks the point in time when gradual changes reach full modern behavior and morphology, and does not represent a rapid evolutionary event related to only one important genetic change that leads to a rapid emergence of modern human brain morphology and behavioral modernity. In terms of Renfrew's sapient paradox, this suggests a gradual accumulation of neural complexity that possibly went hand in hand with the accumulating cultural artefacts.
Excellent, that parallels my findings in my new book, Big Brains and the Human Superorganism.
@ Derek Hodgson @ Niccolo Caldararo
Thanks for the book and modern human brain shape refs. However, Neubauer et al. make the common mistake of assuming that a "human revolution" must be the result of some sudden (and un-Darwinian) genetic change. Indeed, few if any social anthropologists would accept that culture of modern human type can be explained genetically - though of course it must require genetic pre-adaptation. Just consider, e.g., sexual modesty, which is present in some form in all human societies examined to date. No self-respecting ape would think it a good idea to conceal the genitals. As Marshall Sahlins pointed out, "In apes, sex controls society; but in humans, society controls sex." He inferred a revolutionary change which turned an ape-like social order on its head. Or consider human cooperation, which biologists sometimes call "generalised reciprocity" even though it is not exclusively reciprocal. Rescue workers often risk and lose their lives attempting to save total strangers from whom they expect nothing in return. The behaviour of the suicide bomber is equally "anti-biological". Kin-based and reciprocal altruism are the major forms of within-species cooperation that can be explained genetically. I think it is no coincidence that human societies have formal systems of inflated kinship (from classificatory kinship to nationality) and reciprocity (economic systems, many of which baffle professional economists). These are cultural inventions hard to explain exclusively in selfish-gene terms.
Neubauer et al also reveal their cognocentric bias by ignoring major social and spontaneous functions of brain structures they mention. For example, the precuneus as a central hub of so-called default activity. This implies daydreaming hence "theatre of mind" including social scenarios with toy people who act as though they have minds of their own. Hence role-play and the ability to run multiple minds in parallel.
Check out Ian Watts's research on haematite use, female cosmetic coalitions, male menstruation rituals, ritual/speech co-evolution theory, and the play and display hypothesis.
Thanks Charles, but I do think that it depends on what you call "hiding genitals", and all insect societies control sex to some extent, more in some than others. Birds are another parallel, especially since most male birds have no penis (just think how that would change human society!). I think that "daydreaming", "theatre of mind", and rehearsal behavior are difficult to analyze in other animals (almost as much as in humans; see R. D. Laing), but birds do demonstrate some very convincing similarities. I discuss all this in my book. There is considerable brain shape variation in humans, including with respect to sex and population (C. Mankiw et al., 'Allometric Analysis Detects Brain Size-Independent Effects of Sex and Sex Chromosome Complement on Human Cerebellar Organization', J. Neurosci. 37: 5221-5231 (2017), PMID 28314818).
Also, variation seems to scale by brain size: http://science.sciencemag.org/content/early/2018/05/30/science.aar2578.full. Too often we find some association in our data and decide that it applies to all people or has some evolutionary significance. We shall see.
To Charles Whitehead,
I don't think Neubauer et al were making 'the common mistake of assuming that a "human revolution" must be the result of some sudden (and un-Darwinian) genetic change'. What they seem to be suggesting is that there was probably a feedback mechanism between niche construction, genetic factors, and the gradual globularization of the brain over a period of 300,000 years. As Neubauer et al state: 'It is intriguing that the evolutionary brain globularization in H. sapiens parallels the emergence of behavioral modernity documented by the archaeological record'. This relationship began with the inception of the Middle Stone Age around 300,000 years ago right up to the Late Stone Age 40,000-50,000 years ago. That feedback mechanism seems to be related to neural rewiring and perhaps increasing multimodal neural networks. The stress here is, therefore, on 'gradual' and not some sudden revolutionary occurrence. Perhaps we need to think more along the lines of an interactionist model where the above factors interacted with growing and more stable population levels that gave rise to robust cumulative material culture.
@ Derek Hodgson @ Niccolo Caldararo
Argument alone doesn't usually change scientists' minds. What counts I think is the kind of evidence they are exposed to. I did a pilot survey of scientists' opinions on the origins of language and would like to do a larger study (this was in a self-selected group, twenty contributors to "The Social Origins of Language" OUP 2014, i.e. scientists reacting to Chomsky's non-social idea of a macromutation in a single individual). In this group, social anthropologists and archaeologists all thought language originated relatively suddenly and was largely or entirely cultural in origin, as opposed to biologists, linguists, and psychologists who thought language emerged gradually and was much more influenced by genetic changes. Two robotics engineers were extremely divergent outliers.
What archaeological evidence shows depends on what you regard as relevant. Ian Watts found a tenfold increase in the use of red ochre/haematite around 110 kya in South Africa, which he interpreted as a marker for the arrival of culture of modern human type (characterised by ritual and both regulatory and institutional rules - the latter required e.g. for the institution of marriage or a dual-moiety matriarchal clan system). Of course, Africa is a big place and I think modern culture is probably older than 110 thousand years.
A couple of comments:
Niccolo: Birds and insects are not apes. As for concealment of the genitals: among Kalahari Bushmen, for example, the climate renders clothes unnecessary, but they nevertheless cover their "private parts" with a small leather apron.
Derek: I don't think male menstruation can be explained by brain rewiring any more than the industrial revolution. The brain in any case is a self-sculpting organ (two phases of arborization followed by axonal and neural pruning).
See earlier contributions to this thread (originally about Renfrew's "sapient paradox")
I do look at neuroscience, psychology, and evolutionary biology and I think globularization is interesting, but I wonder if either of you checked out any of the topics I mentioned?
If we think of human flexibility as a form of cognitive bandwidth, we may be better placed to appreciate the gradual increase of material culture over the long term. As Kissel and Fuentes's (2018) just-published paper outlines, and as I demonstrated in earlier papers, the cognitive precursors for cultural artifacts may even predate the arrival of anatomically "modern" humans 300,000 years ago. With the arrival of anatomically modern humans, however, we see a coming and going of material culture that often stretches and tests the limits of this bandwidth throughout the ensuing 300,000 years. The cognitive bandwidth of Neanderthals, however, may not have been as broad as that of Homo sapiens, to the extent that they were unable to take advantage of raised population levels in the way Homo sapiens could. Thus, Homo sapiens were able to exploit the innovations that came with larger populations and increasing interaction between groups. As Kissel and Fuentes state, this was part of a long drawn-out process involving multifactorial reciprocities and not tied to a specific event. Such insights seem to complement Neubauer et al.'s (2018) findings mentioned above.
Hodgson, D. 2010. Determining the behavioural profile of early modern humans: assimilating population dynamics and cognitive abilities. Before Farming. 2 (1).
http://www.waspress.co.uk/journals/beforefarming/journal_20102/index.php
Hodgson, D. 2013. Cognitive Evolution, Population, Transmission, and Material Culture. Biological Theory. 7: 237-246. http://link.springer.com/article/10.1007/s13752-012-0074-y
Kissel, M. and Fuentes, A. 2018. ‘Behavioral modernity’ as a process, not an event, in the human niche, Time and Mind, 11:2, 163-183, DOI: 10.1080/1751696X.2018.1469230
Here are Renfrew's words about the "sapient paradox" (Renfrew, Colin. 'Neuroscience, evolution and the sapient paradox: the factuality of value and of the sacred.' Philosophical Transactions of the Royal Society of London, Series B, Biological Sciences 363(1499) (2008): 2041-2047. doi:10.1098/rstb.2008.0010).
"It may be appropriate to speak of the Sedentary Revolution" - "the development of sedentism and then of agriculture ..." "Why did it all [the Sedentary Revolution] take so long? If the sapient phase of human evolution was accomplished some 60 000 years ago [...The biological basis of our species has been established for at least some 60 000 years ago... ], why did it take a further 50 000 years for these sapient humans to get their act together and transform the world? That is the sapient paradox."
In other words, the "sapient paradox" is the question of why it took around 50 thousand years from the date when Homo sapiens became behaviorally modern to the Agricultural Revolution, which happened about 10,000 BC. Many works have studied the "sapient paradox". It looks like a paradox if, out of, say, 60 thousand years, almost nothing happened in the first 80% of the time, and, later, things started to happen in abundance.
I would argue that such an interpretation of what happens looks like a "paradox" only if you assume that processes in Nature and human society occur linearly. Please note that many processes in Nature and human society follow exponential or even hyperbolic curves. For example, human population growth followed a hyperbolic curve until recently.
In my book "Subsurface History of Humanity: Direction of History," you will find out that, based on vast scholars publications about archeological and other artifacts, the whole global history of humankind could be graphed as an exponential curve starting from the date when Homo Sapiens became behaviorally modern. In my book, I choose the date 42000 BC, when the first known partly-human, partly-animal figures were drawn, as a starting data point. Please note that the curve will not change much if you choose 60000 BC instead.
With an exponential curve, the "sapient paradox" does not exist.
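To make Victor's point concrete, here is a minimal numerical sketch (my own illustration, with made-up growth constants, not figures from his book): under exponential growth, almost all cumulative output falls in the last stretch of the timeline, so a long, apparently empty early phase is exactly what the curve predicts rather than a paradox.

```python
import numpy as np

T = 60_000                    # years of "sapient" history (illustrative)
r = np.log(1e6) / T           # rate chosen so output grows a million-fold over T

t = np.linspace(0, T, 601)
cumulative = np.exp(r * t) - 1   # cumulative cultural output, arbitrary units

# Fraction of all output accumulated in the first 80% of the time span:
share = cumulative[t <= 0.8 * T][-1] / cumulative[-1]
print(f"Share of total output in the first 80% of the time: {share:.1%}")
# ~6% - i.e. "almost nothing happened" early on, by construction.
```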
An explanation of why an Agricultural Revolution happened when it happened is in my book.
To Victor,
As Peter Hiscock mentions in an early post on the present question, the issue of when humans became "behaviourally modern" (a concept now dismissed by many archaeologists) is far from straightforward. In other words, there was a lot going on in brains and material outcomes from 300,000 years ago onwards that remains to be unpacked. It is becoming clear, however, that we have to be careful when talking about "revolutions" and thresholds, as these have been found to be somewhat slippery, and the same probably applies to the agricultural revolution. For example, there was a great deal of innovation taking place in South Africa from 100,000 years ago onwards. This may be related to subtle changes to the cortex in Homo sapiens sapiens around 125,000 years ago, as Hublin and associates suggest.
To Derek Hodgson - I mentioned "revolutions" only because Renfrew was using the term "revolution". You will not find any revolutions, whether French or Agricultural, nor wars, imperial periods, or similar events in my book.
I tend to think of psychological factors as proximate not ultimate causes. Cognitivists seem to treat them as if they are ultimate, as if cognitive factors drive evolution rather than evolving in response to changing selection pressures. Isn't there a fundamental confusion here?
To Derek Hodgson - Even if you dismiss "behavioural modernity" and instead consider what was going on in "material outcomes" in humanity's history since 300 thousand years ago, that would not change the shape of the exponential curve of humankind's development much.
To Peter J Richerson: Yes, but the recent material turn (the 4Es) in archaeology does not recognize a strict demarcation between cognition and the material that undergoes transformation as humans construct an artificial environmental niche. The "mind" is viewed as extended into the artefact produced, which in turn restructures the cortex in a kind of never-ending loop. According to material engagement theory, there is an intimate relationship between human psychology and the objects that go towards making up a culture.
At least during the Pleistocene, increases in brain size and cultural complexity look pretty linear to me. The Holocene, by contrast, looks hyper-exponential.
To Peter. More people, more innovation, more diversity in culture, with cultures able to "compete" with one another on a number of levels during the Holocene. The human brain must therefore already have been primed to engage in such diversity sometime before the Holocene, perhaps thanks to its flexibility for engaging in social behaviour and sharing information during the Late Pleistocene. The material signal in the archaeological record during the Late Pleistocene ebbs and flows depending on circumstances, possibly related to changes in population levels and densities. This is reflected in the innovations that occurred in South Africa around 80,000 years ago: the invention of traps, evidence for bow-and-arrow technology, the making of glue, etc.
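To make the more-people-more-innovation link concrete, here is a toy simulation in the spirit of Henrich's (2004) model of cumulative culture - my own sketch with arbitrary loss and noise parameters, not a model drawn from the papers cited in this thread. Each generation, every learner imperfectly copies the most skilled individual; copying is lossy on average but occasionally overshoots, so whether skill accumulates or decays depends on population size.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_skill_change(N: int, generations: int = 200) -> float:
    """Average per-generation change in peak skill for N learners."""
    skill = 0.0
    for _ in range(generations):
        # Each learner copies the current best model; copying loses 3.0
        # units on average, but the Gumbel right tail allows lucky gains.
        attempts = skill - 3.0 + rng.gumbel(0.0, 0.8, size=N)
        skill = attempts.max()   # the best attempt becomes the next model
    return skill / generations

for N in (10, 100, 1000):
    print(f"N={N:5d}: skill change per generation = {mean_skill_change(N):+.3f}")
# Small populations lose skill on average; larger ones accumulate it.
```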
To Victor. Surely we need to be careful as to what is meant by "development". Bit of a loaded term I would say?
To Derek Hodgson - Yes, "development" is a loaded term. However, I do not see a better term to describe what happens with humankind during, say, the last 44 thousand years.
To Peter J Richerson - > "At least during the Pleistocene, increases in brain size and cultural complexity look pretty linear to me. The Holocene, by contrast, looks hyper-exponential." - Yes, the beginning of an exponential curve always looks like a straight line. It is hard to see the difference unless you look at a picture which includes both the slow and the fast-growing phases of the exponential curve. In my book and in one of my articles (please visit the links in my RG profile), you can see the history of humankind during the last 44 thousand years in one graph.
To Derek: the trouble is, there are often multiple ways that proximal mechanisms can drive behavior. For example, in seasonally migrating species, it might be day length or temperature that triggers migration. I think the trouble with cognitive explanations is that we have no method for observing cognition other than the behavior it causes, even in living people. Fossils present a whole 'nother level of difficulty. As the behaviorists observed long ago, mentalistic explanations are just speculative philosophy of mind in the absence of the ability to observe mental structures. If artifacts are all we have, our theory should be about artifacts, not the unfortunately non-observable mental devices that engender them.
To Victor: I didn't realize that you were talking about only the last 44,000 years, not the whole Pleistocene. You are right, I think, that the Holocene is exponential - I'd rather say hyperexponential - and this might stretch back into the latest Pleistocene. See my work with Russ Genet: https://learn.culturalevolutionsociety.org/human_systems_module/intro - see especially the fifth model, which has full-blown hyperexponential behavior.
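For readers wondering what "hyperexponential" looks like in practice, here is a minimal sketch - mine, not Richerson and Genet's actual fifth model - of the classic von Foerster-style hyperbolic growth law dN/dt = k*N**2. Its closed-form solution looks almost linear early on, then outruns any fixed-rate exponential as it approaches a finite-time singularity, matching the "linear Pleistocene, explosive Holocene" impression.

```python
import numpy as np

def hyperbolic(n0: float, k: float, t: np.ndarray) -> np.ndarray:
    """Solution of dN/dt = k*N**2: N(t) = n0 / (1 - k*n0*t), singular at t = 1/(k*n0)."""
    return n0 / (1.0 - k * n0 * t)

n0, k = 1.0, 1e-5                # illustrative initial size and coupling constant
t_singular = 1.0 / (k * n0)      # finite-time blow-up ("doomsday" date)
t = np.linspace(0.0, 0.99 * t_singular, 100)
N = hyperbolic(n0, k, t)

print(f"N halfway to the singularity: {N[len(N) // 2]:.2f}")   # ~2x initial
print(f"N at 99% of the way:          {N[-1]:.1f}")            # ~100x initial
```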