Terrence Deacon's Incomplete Nature
The Problematic Matter of Mind
A review of Terrence Deacon's Incomplete Nature: How Mind Emerged from Matter (624 pages, W. W. Norton & Company; November 21, 2011)
According to Julian Jaynes, our ancestors first became cognizant of the problem of mind when they literally stopped hearing the voices inside their heads. Prior to that, conscious experience as we know it did not exist. Instead human cognitions amounted to the poetic directives of aurally hallucinated 'gods' emanating from the right cerebral hemisphere. This did not change until humans developed, via the left hemisphere and written text, the linguistic sophistication of describing experience metaphorically, which occurred in response to socio-cultural changes brought on by the development and collapse of the first civilizations.
Not coincidentally, in ancient Greece, the silencing of the right hemispheric “god voices” was accompanied by the rapid ascent of philosophy, math, and science: the mental constructions upon which contemporary Western civilization is founded.
During this transition the meanings of the Greek words soma and psyche, which originally referred literally to ‘body’ (or ‘corpse’) and ‘life’ (or ‘breath’), evolved into a metaphorical dichotomy: body and soul. And so was born the pernicious notion of dualism. Quoting Jaynes (The Origin of Consciousness in the Breakdown of the Bicameral Mind, 1st Edition, p. 291):
“But the matter does not stop there. In Pindar, Heraclitus, and others around 500 B.C., psyche and nous [‘mind’ or ‘thought’, from noos, literally ‘seeing’] begin to coalesce. It is now the conscious subjective mind-space and its self that is opposed to the material body. Cults spring up about this new wonder-provoking division between psyche and soma. It both excites and seems to explain the new conscious experience, thus reinforcing its very existence. The conscious psyche is imprisoned in the body as in a tomb. It becomes an object of wide-eyed controversy. Where is it? And the locations in the body or outside it vary. What is it made of? Water (Thales), blood, air (Anaximenes), breath (Xenophanes), fire (Heraclitus), and so on, as the science of it all begins in a morass of pseudoquestions.
“So dualism, that central difficulty in this problem of consciousness, begins its huge haunted career through history, to be firmly set in the firmament of thought by Plato, moving through Gnosticism into the great religions, up through the arrogant assurances of Descartes to become one of the great spurious quandaries of modern psychology.”
Thus for Jaynes, consciousness is problematic because it is entangled with a misconception—dualism—whose modern incarnation stems from René Descartes’ pronouncement that mind is an immaterial yet somehow substantial entity (‘soul’) that comes to reside in a human body (in the brain). Descartes was a contemporary of Galileo, and he no doubt felt the Inquisition breathing down his neck. It can be argued that his dualistic stance served to placate the Church, which retained authority over the soul while science (natural philosophy) pursued its agenda of discovering the natural causes (‘mechanisms’) governing the body.
But then along came Darwin, whose theory of evolution by means of natural selection renders unintelligible the idea that human beings are the sole beneficiaries of mind; some would even say it renders the problem of mind moot.
For within the Darwinian paradigm, it is quite easy to rationalize mind as being nothing more than a set of neural mechanisms that evolved because of the reproductive advantage they afford. This is the position of my friend Pcunix, who has argued that mind is therefore a non-problem.
The argument is essentially thus: organisms can be adequately explained in terms of mechanisms; mind is a feature of organisms; therefore mind can be adequately explained in terms of mechanisms. That mind is a feature of organisms (of animals at least) is self-evident; hence, the argument rests on the contention that organisms can be adequately explained by mechanisms.
Is that true?
To answer that question requires that we define both mechanism and organism, and that we do so predicatively, i.e. such that ‘mechanism’ is not used in our definition of organism.
A mechanism can be broadly defined as any process or configuration that behaves deterministically—that is, in a way that is mathematically predictable from sufficiently specified initial and boundary conditions. Anything that is predicted by the laws of physics (Newton’s Laws of Motion, the Second Law of Thermodynamics, Maxwell’s equations etc.) fits this definition, as does the evolutionary process of natural selection. A machine is a mechanism; but not all mechanisms would be described as machines, at least in the usual sense of the word (natural selection being an example).
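To make the determinism criterion concrete, here is a minimal sketch (my own illustration, not from the book): a process governed by a fixed rule, started from fully specified initial conditions, yields exactly the same trajectory every time it is run.

```python
# Toy illustration of a mechanism in the sense defined above: a fixed,
# deterministic update rule. Nothing here chooses; given the same initial
# condition, the trajectory is exactly reproducible.

def simulate(x0, steps):
    """Iterate a deterministic rule: each step moves x halfway toward 1.0."""
    x = x0
    trajectory = [x]
    for _ in range(steps):
        x = x + 0.5 * (1.0 - x)  # no randomness, no degrees of freedom left open
        trajectory.append(x)
    return trajectory

run1 = simulate(0.0, 10)
run2 = simulate(0.0, 10)
assert run1 == run2  # identical inputs entail identical behavior, by definition
```

That reproducibility, not any particular material makeup, is what qualifies a process as mechanistic under the definition above.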
So a definitive attribute of a mechanism is that it is deterministic.
Defining ‘organism’ is not nearly so easy. If anyone tells you that it is, you can rest assured that person is deluded. To define ‘organism’ is to define life itself. Many books, written by some of the smartest people in the history of science, have struggled with that subject. But for the sake of this discussion we can at least specify some of the definitive features of all known organisms, without loss of generality (though perhaps at the risk of over-generalizing to include things that are not really organisms).
An organism can be broadly defined as any autonomous process or configuration that develops into existence (i.e., comes to be via a trajectory of growth and self-organization that transforms a relatively small and/or vague progenitor, such as a single undifferentiated cell, into a much larger, more powerful and/or distinctive entity, often composed of many functionally differentiated cells), and that works to both maintain homeostasis (persist and adapt under changing and often unpredictable circumstances) and reproduce. Any autonomous agent that can be said to be alive fits this definition.
Hence a definitive attribute of an organism is that it is developmentally self-organizing and physiologically self-maintaining over a range of somewhat (though not entirely) unpredictable conditions.
We can now ask: are organisms adequately explained by mechanisms?
The answer is no. A close examination of our definitions reveals why: a mechanism, by definition, is fully determined (by physical laws and, just as importantly, by initial and boundary conditions), whereas an organism, by definition, is not. That is in fact why it is easy to define the former and not so easy to define the latter.
Now some might object that this merely reflects our having prohibited reference to ‘mechanism’ in the definition of ‘organism’. Without that prohibition we could, for example, define an organism as ‘a complex organic mechanism’. But this type of reasoning is impredicative, and hence of no use in answering our question—it simply assumes as true that which we seek to ascertain.
I expect that many believe this to be a safe assumption. Organisms are physical (material) beings that are to some extent predictable, and predictability in physical systems—i.e. real systems—implies determinism, and hence mechanism. So what is predictable in biology is by and large if not entirely mechanistic. But biology is not entirely predictable, and is in fact largely indeterminate with respect to the laws of physics. That does not mean that it violates those laws, or even that it involves anything substantial that is not physical. It only means that the laws of physics do not and cannot predict biology. No matter how far the science of physics progresses, it will never be able to predict biology. Biology is not applied physics, is not derivable from physics, and will never be a branch of physics. These days most physicists (if not most biologists) know this to be true. As noted by Nobel prize-winning physicist Philip W. Anderson in 1972, physics is a science of symmetry, but life breaks symmetry. And it does so in ways that are completely arbitrary.
But there is another problem with trying to explain organisms entirely in terms of mechanisms, which is this: all organisms act intentionally, whereas mechanisms do not. By this I mean that organisms make arbitrary choices that are directed toward bringing about some end that can reasonably be described as being desired by the organism. No mechanism that is not created by a living system can be said to do this. It might be argued that computers can, but computers are mechanisms created by human beings, and the choices that they make are all constrained by choices that their human engineers and programmers made in order to achieve a desired end. Intentional action—a defining attribute of life and mind—cannot be viewed as intrinsically mechanistic except by assumption (i.e. belief). And (to my mind at least) that is an assumption that is as absurd as belief in a supernatural deity.
So the problem of mind is this: although the anatomy, physiology, development and evolution of an organism can be described in terms of mechanisms, those mechanisms are merely means to an end that the mechanisms themselves do not intrinsically specify. This is even true for the mechanism of natural selection, which is merely a means to the end of equilibration—that is to say, natural selection is a servant of the Second Law of Thermodynamics, that most non-mechanistic of physical laws (note that the phrase ‘statistical mechanics’ is oxymoronic, but we’ll leave that for another hub).
Science only deals with that which is empirically or mathematically verifiable, which is why concepts like ‘soul’, ‘god’, and ‘intelligent design’ are not scientific. The problem then is this: if intentional causation is not mechanistic, how can it be explained by science?
That is the problem that Terrence Deacon attempts to solve in his new book Incomplete Nature: How Mind Emerged from Matter. Does he succeed? To find out, read on.
The absential nature of causal constraint
Deacon’s thesis is that solving the problem requires that we conceive mind not as a presence, as it is usually conceived, but rather as an absence. In other words, we need to frame the problem in terms of constraint rather than mechanism. There is an important difference between the two approaches. A mechanism is fully determined, whereas a constraint is simply anything that reduces the number of possibilities or degrees of freedom (i.e., an asymmetry). So something can be constrained but not fully determined. Anything that can be defined at all is constrained in some way, because definition minimally implies that a thing is what it is because it is not something else.
As noted above, an organism cannot be adequately explained in terms of mechanisms. But perhaps it can be adequately explained in terms of constraints—what Deacon refers to as ‘absentials’.
Deacon likens this insight to that which followed upon the incorporation of zero into mathematics. Absence has not always been represented by a numerical symbol; for example, there is no Roman numeral for zero. Deacon notes that the numerical representation of absence was contested by mathematicians in ancient Greece. It was, however, recognized and used by mathematicians in ancient India (that hotbed of mysticism!), and eventually migrated to the Middle East via Arab trade routes, to be incorporated into the modern (Arabic) numeral system. Nevertheless European scholars resisted the very notion, even into the Middle Ages, as being ‘magical’—a resistance not unlike that displayed by contemporary reductionists who pejoratively refer to non-mechanistic explanations of causality as being ‘magical’ or ‘mystical’.
Regarding such cognitive entrenchment Deacon writes (pp. 8–10):
“The difficulty we face when dealing with absences that matter has a striking historical parallel: the problems posed by the concept of zero. ...[O]ne of the greatest advances in the history of mathematics was the discovery of zero. A symbol designating the lack of quantity was not merely important because of the convenience it offered for notating large quantities. It transformed the very concept of number and revolutionized the process of calculation. In many ways, the discovery of the usefulness of zero marks the dawn of modern mathematics. But as many historians have noted, zero was at times feared, banned, shunned, and worshiped during the millennia-long history that preceded its acceptance in the West. And despite the fact that it is a cornerstone of mathematics and a critical building block of modern science, it remains problematic, as every child studying the operation of division soon learns.
“For medieval mathematicians, zero was the devil’s number. The unnatural way it behaved with respect to other numbers when incorporated into calculations suggested that it could be dangerous. Even today schoolchildren are warned of the dangers of dividing by zero. Do this and you can show that 1 = 2 or that all numbers are equal. In contemporary neuroscience, molecular biology, and dynamical systems theory approaches to life and mind, there is an analogous assumption about concepts like representation and purposiveness. Many of the most respected researchers in these fields have decided that these concepts are not even helpful heuristics. It is not uncommon to hear quite explicit injunctions against their use to describe organism properties or cognitive operations. The almost universal assumption is that modern computational and dynamical approaches to these subjects have made these concepts as anachronistic as phlogiston.
“So the idea of allowing the potentially achievable consequence characterizing a function, a reference, or an intended goal to play a causal role in our explanations of physical change has become anathema for science. A potential purpose or meaning must either be reducible to a merely physical parameter identified within the phenomenon in question, or else it must be treated as a useful fiction only allowed into discussion as a shorthand appeal to folk psychology for the sake of non-technical communication. Centuries of battling against explanations based on superstition, magic, supernatural beings, and divine purpose have trained us to be highly suspicious of any mention of such intentional and teleological properties, where things are explained as existing “for-the-sake-of” something else. These phenomena can’t be what they seem. Besides, assuming that they are what they seem will almost certainly lead to absurdities as problematic as dividing by zero.” [Italics in original]
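As an aside, the ‘1 = 2’ absurdity that Deacon mentions is worth seeing worked out, because it shows exactly how an unexamined zero wrecks an otherwise valid chain of reasoning:

```latex
\begin{align*}
\text{let } a &= b \\
a^2 &= ab \\
a^2 - b^2 &= ab - b^2 \\
(a+b)(a-b) &= b(a-b) \\
a + b &= b \qquad \text{(illicit: dividing both sides by } a - b = 0\text{)} \\
2b &= b \;\Longrightarrow\; 2 = 1
\end{align*}
```

Every step but the fifth is valid; the division by a − b silently treats an absence (zero) as if it were an ordinary presence.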
But how can we conceive of mind as absence? Deacon’s approach to answering this question is to systematically examine how constraint can be causative, beginning from first principles of thermodynamics. He does this in the last twelve of his seventeen chapters, comprising some five hundred pages, after spending the first five chapters framing the problem historically and critiquing how previous attempts involving reductionism, evolutionary theory, information theory, computer science, cybernetics, dynamic systems theory, and neurobiology have fallen short. I do not need to spend any more verbiage on the critique, and so will cut to the chase.
Change that is spontaneous versus change that requires work
A key clarifying insight comes from recognizing that there are two fundamentally different kinds of change: that which occurs spontaneously and that which requires work. Deacon refers to these respectively as ‘orthograde’ change and ‘contragrade’ change. Examples of orthograde change are a car rolling downhill, and an ice cube melting in warm water. Examples of contragrade change are a car being driven uphill, and a freezer making ice. It should be obvious from this that orthograde change implies an absence of constraint along a potential gradient, whereas contragrade change implies the presence of constraint that must be overcome for the change to occur. In other words, anything that can be characterized as work implies the presence of constraint—i.e., the presence of absence.
What sort of absence is overcome by work? The answer is of course ‘absence of potential for spontaneous change’. Change that requires work is change that will not happen without work. But that doesn’t mean that the change is impossible; it simply means that effort (expenditure of free energy) is needed to create conditions that will allow it to happen spontaneously.
How does that work? According to Deacon, work occurs whenever a constraint in one system is juxtaposed with orthograde change in another, such that the latter overcomes the former. Thus (for example), a chemical reaction that is prevented by a potential barrier (‘activation energy’) between reactants and products can be made to occur by heating the reactants. In this case the juxtaposition of the reactants with a source of heat drives the contragrade change of overcoming the constraint of the potential barrier. Of course, once the work (activation energy) needed to overcome that barrier is accomplished, the chemical reaction becomes orthograde, as there is no longer a constraint preventing it from happening. That’s what happens with explosions.
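The standard quantitative expression of this, from textbook chemical kinetics rather than from Deacon, is the Arrhenius equation, k = A·exp(−Ea/RT): the rate constant k grows steeply with temperature T because heat supplies the work of surmounting the activation barrier Ea. A quick numerical sketch (illustrative values only):

```python
# The Arrhenius equation k = A * exp(-Ea / (R * T)): heating the reactants
# overcomes the constraint of the activation barrier, and the rate constant
# rises steeply with temperature. Values below are illustrative, not from
# any particular reaction in the book.
import math

R = 8.314  # molar gas constant, J/(mol*K)

def rate_constant(A, Ea, T):
    """Arrhenius rate constant for pre-exponential factor A and barrier Ea (J/mol)."""
    return A * math.exp(-Ea / (R * T))

Ea = 75_000.0                              # a typical activation energy, 75 kJ/mol
k_cold = rate_constant(1e13, Ea, 298.0)    # room temperature
k_hot = rate_constant(1e13, Ea, 398.0)     # heated by 100 K
print(f"heating by 100 K speeds the reaction ~{k_hot / k_cold:,.0f}-fold")
```

A modest rise in temperature multiplies the rate a thousandfold or more, which is why a reaction that is orthograde in principle can sit unchanged until work is done on it, and why, once the barrier is gone, the change runs away (the explosion case).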
And that is why the Second Law of Thermodynamics, an orthograde tendency toward equilibration (disorder), does not prevent the spontaneous emergence of order. As shown by Ilya Prigogine, all that is needed for the latter to occur is that the disordered (and thus constrained) system be hooked up to a larger unconstrained system. The orthograde change of the latter then works to drive contragrade change in the former. Deacon refers to the spontaneous emergence of a form by way of thermodynamic work as ‘morphodynamics’. The Second Law does not prohibit the spontaneous emergence of order because morphodynamics emerges from thermodynamics, via the asymmetric juxtaposition of orthograde change occurring in different non-equilibrium systems. The result is the creation of higher-order constraints that previously did not exist—a process that I refer to as development. This does not explain how teleology (intentional, goal-directed causation) emerges from physical systems. But it suggests an approach toward answering that question.
Teleodynamics and teleodynamic work
Deacon’s approach is as follows: if morphodynamics results from asymmetric juxtaposition of opposing orthograde tendencies of different thermodynamic systems, perhaps we can explain teleology, and thus ultimately mind, by considering what happens when different morphodynamic systems are asymmetrically juxtaposed. Deacon does this in two ways: first, by articulating a hypothesis for what is minimally required of matter to generate goal-directed behavior (what he refers to as ‘ententionality’, which is a precursor to the more developed ‘intentionality’ of sentient creatures), which speaks directly to the problem of abiogenesis (the origin of life); and second, by generalizing the concept of work to show how the colloquial usage of the term (as in mental work) extends from the technical meaning of the term in physics (as in mechanical work, or force times displacement).
Deacon coins the term ‘teleodynamics’ to refer to constrained dynamics that produce ententional phenomena. His solution to the problem of what constitutes the minimal teleodynamic system is the ‘autogen’: molecular autocatalysis coupled (juxtaposed) to molecular self-assembly. Autocatalysis occurs via cyclic configurations of orthograde reactions, that is, when the products of a thermodynamically favored reaction are reactants of subsequent reactions that then produce reactants for the first. Autocatalysis fuels growth, but is constrained by diffusion, as it requires that the reactants be in close proximity. Self-assembly occurs via orthograde (thermodynamically favored) aggregation of molecules into metastable structures. Self-assembly also fuels growth, but is constrained by the availability of molecular components.
Autogenesis is what happens when autocatalysis produces the components for self-assembly, which in turn produces a container that limits diffusion, thereby favoring autocatalysis. Such a configuration allows the constraints that impede continuance of each process (autocatalysis and self-assembly) to be overcome by the other process—each completes the other. It is teleodynamic because the configuration establishes constraints that entail a ‘goal’—its own continuation—under uncertain circumstances: conditions favoring growth afford reproduction via autocatalysis, whereas growth-limiting conditions do not bring about its demise, owing to the long-term stability of the self-assembled structures.
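The reciprocal coupling is easy to caricature numerically. The following toy model is my own construction, not Deacon's, and its rate constants are arbitrary; it shows only the qualitative point that each process relieves the constraint limiting the other, so that the coupled configuration persists where autocatalysis alone dies out.

```python
# A toy caricature of an autogen (my own construction with arbitrary rate
# constants, offered only as a qualitative sketch): autocatalysis makes the
# components of a containing shell, and the shell limits diffusive loss,
# which in turn sustains autocatalysis.

def simulate(couple_shell=True, steps=200, dt=0.1):
    catalyst, shell = 1.0, 0.0
    for _ in range(steps):
        # Containment rises from 0 (no shell) toward 1 (complete shell).
        containment = shell / (shell + 1.0) if couple_shell else 0.0
        growth = 1.2 * catalyst * (1.0 - catalyst / 10.0)  # resource-limited autocatalysis
        loss = 1.5 * (1.0 - containment) * catalyst        # diffusion drains whatever the shell fails to hold
        catalyst += (growth - loss) * dt
        shell += (0.5 * catalyst - 0.1 * shell) * dt       # shell built from catalytic products, slowly decaying
    return catalyst

coupled = simulate(couple_shell=True)     # the two processes complete each other: the system persists
uncoupled = simulate(couple_shell=False)  # autocatalysis alone: diffusion wins and it dies out
```

Run with the coupling intact, the catalyst population survives and grows; with the shell decoupled, the identical autocatalytic chemistry dissipates. The ‘goal’ of self-continuation is nothing over and above this mutual relief of constraints.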
Autogens are not organisms and are not alive, but their teleodynamic nature is lifelike and could provide a predisposition for life. As it stands Deacon’s proposal for ‘autogenesis’ is a theory in need of empirical support. It is however empirically testable (and hence scientific), and provides a novel conceptual framework for thinking about the origin of life. One of the attractive features of the theory is that it is consistent with both on-earth and extraterrestrial scenarios of abiogenesis. One can imagine that autogenic configurations may have arisen on primordial earth; but owing to their inherent stability, one can as easily imagine that they could have been transported here through space.
Teleodynamics, as embodied by an autogen, is the result of morphodynamic work, which is accomplished by the asymmetric juxtaposition of orthograde change enabled by two different morphodynamic processes (autocatalysis and self-assembly). To explain how mind emerges from teleodynamics, Deacon extends this further by postulating that asymmetric juxtaposition of different teleodynamic systems can accomplish teleodynamic work, and that when this occurs recursively through evolution it creates the higher order constraints that we recognize as mind.
The detailed explanation of how this works occupies the last six chapters of the book (Chapters 12–17): Information, Significance, Evolution, Self, Sentience, and Consciousness. I won’t go through it here, except to note that Deacon’s approach is semiotic, extending directly from the work of the American logician Charles Sanders Peirce (1839–1914). In semiotics information is not something that exists in isolation, but only relationally, via the interpretation of signs. From this perspective the technical (quantitative) definition of ‘information’ developed by Claude Shannon is incomplete, because it ignores some of the definitive attributes of information: the fact that it is both referential and meaningful. Thus, the data stored in computers (or for that matter in DNA) is only information to the extent that it is rendered as an interpretation by living systems (e.g. human minds). The evolution of mind through self, sentience, and consciousness can only be understood semiotically. The number of people who are actually cognizant of that fact remains exceedingly small. Fortunately, the number is growing, and Deacon’s book is a valuable addition to the discourse fueling that growth.
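The point about Shannon's measure is easy to demonstrate: entropy quantifies only how much uncertainty a sign ensemble can reduce, and it is blind to reference and meaning. A brief sketch (my own illustration, not an example from the book):

```python
# Shannon entropy H = -sum(p * log2(p)) measures the quantity of information
# carried by a distribution of signs, but is indifferent to what, if anything,
# those signs are about: a gene and statistically identical noise score the same.
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

meaningful = [0.5, 0.25, 0.125, 0.125]   # say, symbol frequencies in a genetic sequence
meaningless = [0.5, 0.25, 0.125, 0.125]  # random noise with the same statistics
assert shannon_entropy(meaningful) == shannon_entropy(meaningless)  # 1.75 bits each
```

Whatever distinguishes the gene from the noise, it is not anything Shannon's formula can see; on the semiotic view, that remainder is precisely the referential and interpretive dimension of information.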
Is the problem solved?
Deacon does an excellent job of framing the problem of mind, which is a necessary first step toward solving it. But after the expenditure of many words (perhaps too many) and the creation of a fair number of new ones, he does not solve it. Some might object to his penchant for coining terms, as if that explains anything—it does not. But I see this as valuable nonetheless, in part because so many of the old words (e.g. ‘teleology’, ‘intentionality’) are loaded with historical baggage that impedes their effective use in scientific discourse. Perhaps coining new words can rescue the baby from the bathwater. Beyond that, it is apropos to coin new terms in the service of solving the problem of mind, simply because the problem is semiotic, and hence linguistic. If Jaynes is to be believed, human consciousness was created by way of metaphor.
There is a deeper problem that Deacon skirts by way of a straw-man critique in the first few chapters of the book, and by simple assertion later in the book: the nature and meaning of subjectivity. It can be argued (as I have) that subjectivity is an aspect of the material world that is refractory to objective discourse. Deacon criticizes such panpsychist approaches to the problem by arguing that they do not really explain subjectivity. But this does not give credit to the best contemporary philosophers of panpsychism (e.g. Freya Mathews, David Abram), who have advanced good arguments for why any attempt to explain subjectivity in objective terms is misguided. For Deacon, subjectivity is essentially synonymous with human consciousness. For various reasons that I won’t go into here, I beg to differ, and Deacon fails to make a case that changes my mind.
But perhaps the fact that Deacon does not ‘solve’ the problem of mind actually follows from his thesis. If mind is not a ‘presence’, but rather an ‘absence’—a hole of potential filled with nothing but possibility—then it may well be an intractable problem in the sense that it has no definite solution. Unlike a mechanism, life and mind are undetermined by nature.
It’s been said before
My biggest criticism of Incomplete Nature is that much of the substance of Deacon’s argument has been advanced before, in some cases more rigorously, by scholars whom he fails to acknowledge. The work of Robert Rosen (e.g. Anticipatory Systems: Philosophical, Mathematical, and Methodological Foundations and Life Itself: A Comprehensive Inquiry into the Nature, Origin, and Fabrication of Life) is particularly germane to this discourse, and yet not mentioned by Deacon. Deacon's discussion of hierarchically organized constraint echoes Stanley Salthe's groundbreaking work on hierarchy theory, developed in his books Evolving Hierarchical Systems and Development and Evolution: Complexity and Change in Biology, neither of which Deacon cites. I am not the only one to criticize Deacon’s scholarship: in his Nature review, Evan Thompson faults Deacon for not citing his own similar arguments (Mind in Life), or those of Alicia Juarrero (Dynamics in Action: Intentional Behavior as a Complex System). While I have not yet read the former, I have now read the latter and I have to agree. In fact, major elements of Deacon's argument so closely parallel Juarrero's that it is hard to believe that that is mere coincidence (see postscript below).
Be that as it may, Incomplete Nature is an important book simply because it addresses a difficult (perhaps ultimately intractable) problem that many scientists would rather not think about. I think that it does offer original insights, or at least important insights articulated in a new way. There is a lot of convergence on these ideas now, and it is not unusual for deep thinkers to sometimes forget where they acquired their thoughts.
The bottom line for me is this: a good deal of ‘teleodynamic’ work is needed to overcome the constraint of cognitive entrenchment wrought by reductionism, a constraint that wreaks ecological havoc even as it enables technological wonders. If Deacon’s book works toward that end, then it is doing an important and useful job.
Postscript: my take on the Juarrero-Deacon controversy
Since the publication of Incomplete Nature over a year ago, the controversy over its originality—particularly its alleged rehash, without attribution, of themes and arguments advanced by Alicia Juarrero in her book Dynamics in Action—continues to rage. Since I was taken to task by several commenters (including Deacon himself) for criticizing Deacon’s scholarship and uncritically accepting comments stoking the controversy, I have, as promised, made an effort to carefully reconsider the matter. Here is what I have come to conclude.
I have read both books from cover to cover, and I stand by my original claim that Deacon covered much of the same ground covered previously by Juarrero. That ground includes an extended, systematic critique of reductionist attempts to explain (away) mind as a mechanically determined epiphenomenon, but also (and more importantly) the central thesis of both books, which is that a more realistic and potentially fruitful approach is to view intentionality as something that emerges by virtue of hierarchically organized constraints on flow. The question then is whether that ground can reasonably be considered to be ‘public domain’—i.e. so well trod that attribution is not needed. To some extent I would say yes: the criticism of reductionism has indeed been covered by many writers, as has the idea that hierarchical organization produces constraints that impede and thus direct flow. I suspect this is how Deacon rationalized not citing Juarrero (and others who deserved credit where none was given).
There are serious problems with this rationalization however. First, it contradicts Deacon’s strong claim to originality, which he proudly stakes in the book, and which has been promulgated by his fans. The problem with not citing those who developed similar ideas previously is that it misleads the naïve general reader (Deacon’s proclaimed intended audience!) into thinking that the claim of originality covers those ideas. Unfortunately, Deacon fails to carefully delineate which parts of his book are truly original, and which are derived.
The second problem with the ‘public domain’ rationalization is that in light of certain historical facts it strains credibility. Here I admit that I am privy to a bit of inside knowledge, thanks to having colleagues connected in one way or another to the controversy. The key fact of the matter is that both Deacon and Juarrero attended a conference in Mexico in 2007, where they both presented their work. This was several years after publication of Dynamics in Action, and well before publication of Incomplete Nature. I don’t know whether Deacon actually conversed with Juarrero, but he undoubtedly heard her lecture, and so was aware of her work when he wrote his book. To my mind this is the most damning fact of the matter: Deacon must have been aware that Juarrero had already advanced many of the arguments that he articulated in Incomplete Nature. In light of this, his failure to cite her work is unconscionably dishonest. Unfortunately, in the testosterone-saturated, ego-inflated halls of academia, this sort of thing is not at all uncommon.
The situation can be made completely transparent simply by considering its asymmetry, and asking what would likely have occurred had the roles been reversed. Deacon is a male professor at a highly regarded research university (U.C. Berkeley), whereas Juarrero is a female professor (now emerita) at a relatively unknown community college (Prince George’s in Maryland). If Deacon had published a book in 1999 and presented it at a conference attended by Juarrero in 2007, and then Juarrero had subsequently tried to publish a book presenting similar ideas without citing Deacon, do you honestly think she would have gotten away with it? Most likely her book would not have been published until she had rectified that scholarly lapse, and then it probably would not have been touted as a major breakthrough, as Deacon’s book has been in some quarters.
So I stand by my original criticism. But, to be fair, we should try to make clear what is original in Deacon’s book. As he notes in his comment below, he, unlike Juarrero, makes a concerted effort to move beyond complex dynamic systems theory, which he in fact critiques for its limitations. He does so by way of Peircean semiotics, which Juarrero does not bring into her analysis. In addition, he comes up with an original materialist model for the origin of life and mind, involving "autogenesis", which provides a springboard for the development of "teleogenesis". So whereas Juarrero was largely concerned with explaining human intentionality, Deacon seeks to explain the origin of intentionality in the universe. Finally, Deacon’s claim of originality is staked on the notion that mind would be better (more productively) viewed as a specified absence—i.e., a manifestation of constrained potential—rather than a specific presence. While Juarrero also bases her argument on the causal efficacy of contextual constraint, she does not explicitly make the connection between constraint and absence (the incompleteness alluded to in Deacon’s title).
Based on these differences, Deacon and his fans would have us believe that his conception of constraint—the "absential" essence of mind—is fundamentally different than Juarrero’s. He and his fans assert that his idea that intentionality is produced by "teleodynamic work" resulting from asymmetrically juxtaposed (i.e. constrained) “teleodynamics” produced by "morphodynamic work" resulting from asymmetrically juxtaposed “morphodynamics” is completely original and unlike Juarrero’s idea that intentionality emerges by virtue of “second order constraints” produced by the interaction between context and network dynamics (manifesting lower level or first order constraints). Sorry, I don’t see it: from my perspective Deacon’s originality here lies mostly in the coining of new words and in saying the same thing in a different (and perhaps in some ways more compelling) way. Moreover, Deacon’s own words belie his claim of moving beyond complex dynamic systems theory. Is not “teleodynamics” by definition a form of dynamics, specifically those constrained by higher levels of contextual organization? How do those constraints differ fundamentally from what Juarrero called “second order contextual constraints”?
In his acknowledgements Deacon thanks a group of academic compatriots that convened regularly at his house to discuss the ideas developed in his book, a club that he fondly refers to as “Terry and the Pirates”. How fitting is that? As I said in the above review (most of which I wrote before I was made aware of Dynamics in Action and the controversy) when I first read Incomplete Nature I was bothered by the fact that it echoed ideas developed and expressed by thinkers such as Robert Rosen and Stanley Salthe, but without attribution. So when I did become aware of the controversy I was not particularly surprised, as I had already smelled a fish. The tragedy here is that Incomplete Nature is a work of potential importance that is undermined by the failure of its author to acknowledge his sources.
P.P.S.: If you liked this...
Check out my book Global Insanity, co-authored with Don Mikulecky and published in November 2012 by Emergent Publications, which deals with many of the issues discussed here, but in the context of our current global predicament. Incomplete Nature was recommended to me by one of the reviewers of the initial draft of our book.