Joint work with Evelina Leivada is now out in Ampersand. We discuss 10 ambiguous, misused or polysemous terms in linguistics, including I-/E-language, entrainment, reference, ‘the neural basis of X’, (un)grammaticality, third factor, and labeling.
Part of the Abralin series of linguistics lectures: “A Neurocomputational Perspective on Syntax”.
‘How are basic linguistic computations implemented in the brain? Drawing on recent findings from the biological and cognitive sciences, I will propose a neurocomputational model of language comprehension, with particular reference to syntactic and semantic processing. Reviewing the current state of the art, I will defend a multiplexing model of cross-frequency coupling in language comprehension, viewing this higher cognitive capacity as being grounded in endogenous neural oscillatory behaviour. Recent findings from theoretical syntax and semantics will be consulted in order to more carefully frame the implementation and development of this neurocomputational architecture. Alternative accounts in the literature will also be evaluated.’
For more information about Abralin ao Vivo – Linguists Online, visit here.
Starting this week, Psychology Today is publishing new articles of mine under the column “Language and Its Place in Nature”, so more regular writings can be found there. The first piece is about how aging impacts language processing.
New theoretical syntax paper with Jae-Young Shim on the status of categorial labeling and copies in Linguistic Research.
“In contrast to dominant views that the labeling algorithm (LA) detects (i) only the structurally highest copy of a moved object, or (ii) all copies, we propose and defend a third option: (iii) all copies are invisible to LA. The most immediate consequence of this is that objects formed by Internal Merge cannot serve as labels. We relate this proposal to a particular reinterpretation of LA theory such that LA constructs only categorial labels, barring the construction of <φ, φ> configurations. We then propose an interface condition, Equal Embedding (EE), under which agreeing features must be equally embedded in order for interpretation to be licensed. We argue that EE appears to fall out of minimal search requirements. We then propose a principled distinction between Agree and LA, based on their sensitivity to copies and interface relations: Both Agree and LA involve minimal search (Probe-Goal for Agree; categorial feature-detection for LA); however, copies are invisible to LA but not to Agree, and LA involves a CI relation (category-specific interpretation) whereas Agree involves an SM relation (the morpho-phonological process of feature-valuation).”
Scientific understanding can be roughly defined – as Chomsky once said – as a convergence between properties of the mind and properties of the world. What form does this convergence take?
Heisenberg is supposed to have said to Einstein that ‘If nature leads us to mathematical forms of great simplicity and beauty … to forms that no one has previously encountered, we cannot help thinking that they are “true”, that they reveal a genuine feature of nature’. He warned in Physics and Philosophy, discussing his major brainchild, that ‘we have at first no simple guide for correlating the mathematical symbols with concepts of ordinary language; and the only thing we know from the start is the fact that our common concepts cannot be applied to the structure of the atoms’. It follows from this that any scientist, whether in the natural, cognitive, or social sciences, must distinguish between concepts of ordinary discourse (thought, water, shape) and theoretical constructs (dendrite, H2O, mass).
Contemporary physicists wonder whether macroscopic ‘objects’, which appear to follow classical principles, can also follow the laws of the quantum world. Recent research suggests that wave-particle duality, tunnelling, entanglement and coherence are not restricted to subatomic structures. Oxford physicist Vlatko Vedral says: ‘The impression that quantum mechanics is limited to the microworld permeates the public understanding of science’. But this convenient division is ‘a myth’. Quantum states tend to collapse once a system grows too complex to sustain entanglement, but although quantum effects become harder to detect at higher levels of complexity, this does not mean that quantum interactions cease to operate there.
One of the most prominent examples is the role quantum entanglement plays within the electromagnetic fields of plant cells during photosynthesis. Electrons inside plant cells need to reach the chemical reaction centre to deposit their energy. But in the quantum world a particle can take all possible paths at once. The electromagnetic fields within plant cells can reinforce certain paths and cancel out others, drastically increasing the chances of an electron taking the maximally efficient path. The resulting quantum entanglement would only last for a fraction of a second, involving molecules with at most 100,000 atoms.
Jim Al-Khalili and Johnjoe McFadden’s 2014 study Life on the Edge: The Coming of Age of Quantum Biology details how aspects of the quantum world may explain certain properties of macroscopic bodies. To illustrate, the European robin has the ability to compute the direction and strength of the earth’s magnetic field. This common magnetoreceptive navigation method is, for the authors, ‘an enigma’. Earth’s weak magnetic field must set off a biochemical reaction in the robin, but the energy supplied by this process is ‘less than a billionth of the energy needed to break or make a chemical bond’. Vedral likewise argues that the avian compass depends on a quantum-entangled radical pair mechanism, since the superposition and entanglement of the radical pair compass can last for tens of microseconds, potentially long enough to orient the robin in a particular direction.
Al-Khalili and McFadden reject the common view that quantum mechanics plays only a trivial role in biology. They invoke quantum entanglement as the process needed to explain the robin’s magnetoreceptive capacities. Entanglement occurs when two distant particles are non-locally connected by being part of the same quantum state. These states are lost when measured. Randomising factors in macroscopic objects (scattering, vibrations, etc.) cause the wave-like properties of particles to dissipate quickly, effectively acting as ‘measurements’.
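What it means for two particles to be ‘part of the same quantum state’ can be made concrete with a toy calculation. The sketch below (a generic two-qubit Bell state in NumPy, not a model of the robin’s radical pair) shows the signature of maximal entanglement: the joint state is pure, yet each particle viewed on its own is maximally mixed, carrying no information by itself.

```python
import numpy as np

# Toy illustration of entanglement: two qubits in the Bell state
# (|00> + |11>)/sqrt(2). A textbook example, not a biological model.
bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)      # basis order: |00>, |01>, |10>, |11>

rho = np.outer(bell, bell)              # joint density matrix (a pure state)

# Reduced state of particle A: trace out particle B.
rho4 = rho.reshape(2, 2, 2, 2)          # indices: (a, b, a', b')
rho_A = np.einsum('abcb->ac', rho4)     # partial trace over B

# The joint state is pure (Tr[rho^2] = 1), but each half alone is
# maximally mixed (Tr[rho_A^2] = 1/2): the particles only carry
# information jointly, as if they were a single entity.
purity_joint = np.trace(rho @ rho).real
purity_A = np.trace(rho_A @ rho_A).real
print(purity_joint, purity_A)
```

The purity of the reduced state is what a ‘measurement’ (including the random scattering and vibrations mentioned above) degrades: once the environment becomes correlated with the pair, the delicate joint state is lost.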
Quantum phenomena, including tunnelling and superpositions, have been detected in biological processes ranging from photosynthesis to the production of biomolecules, while the forty-six supermolecules that make up DNA are unusually sensitive to quantum mechanical laws. A small but increasing number of researchers are publishing papers on quantum biology (QB), but they remain a minority. Those who reject QB could arguably be indirectly promoting a form of latter-day ‘vitalism’, or the view that biology cannot be reduced to chemistry and physics, whilst QB adherents appear to be reviving a form of 1920s ‘organicism’, which held biology to be governed by the same laws as all other matter.
Sticking to core topics in the biological sciences, it’s been noted that photosynthesis and respiration share crucial features: humans ‘burn’ organic molecules to capture their electrons, while plants use light to ‘burn’ water to capture the electrons of H2O. The motions of particles involved in these processes are governed by quantum laws. Chlorophyll molecules have been shown to operate a search strategy termed the ‘quantum walk’: a photon’s energy moves to the reaction centre through a photosynthetic complex termed the Fenna-Matthews-Olson (FMO) protein, following multiple routes at the same time. Edward O’Reilly and Alexandra Olaya-Castro at UCL have shown that the exciton and its surrounding molecular vibrations share a single quantum of energy in a way which requires a quantum mechanical account, leading to ‘a role for non-trivial quantum phenomena in biology’. A recent review of QB poses a more general challenge, currently elusive: ‘[W]e must also account for how quantum subsystems at the nanoscale can depend on macroscale dynamics of organisms through evolution’.
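The speed-up behind the ‘quantum walk’ can be illustrated numerically. The sketch below is a standard discrete-time Hadamard walk on a line (a generic textbook model, not a simulation of the FMO complex): because the walker’s amplitudes interfere across all routes at once, it spreads ballistically, far faster than the diffusive spread of a classical random walk over the same number of steps.

```python
import numpy as np

# Discrete-time Hadamard quantum walk on a line: a generic toy model
# of why a quantum walker explores space faster than a classical one.
steps = 100
positions = 2 * steps + 1               # reachable sites: -steps .. +steps

# State: amplitude[position, coin], coin 0 = move left, coin 1 = move right.
amp = np.zeros((positions, 2), dtype=complex)
amp[steps, 0] = 1 / np.sqrt(2)          # symmetric initial coin state
amp[steps, 1] = 1j / np.sqrt(2)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard 'coin toss'

for _ in range(steps):
    amp = amp @ H.T                     # toss the quantum coin at every site
    shifted = np.zeros_like(amp)
    shifted[:-1, 0] = amp[1:, 0]        # coin 0 shifts one site left
    shifted[1:, 1] = amp[:-1, 1]        # coin 1 shifts one site right
    amp = shifted

x = np.arange(-steps, steps + 1)
prob = (np.abs(amp) ** 2).sum(axis=1)   # position distribution
mean = (prob * x).sum()
quantum_std = np.sqrt((prob * x ** 2).sum() - mean ** 2)
classical_std = np.sqrt(steps)          # spread of an unbiased classical walk
print(quantum_std, classical_std)
```

After 100 steps the quantum walker’s spread grows linearly with time, whereas the classical walker’s grows only as the square root of time, which is the kind of advantage invoked for energy transport through the photosynthetic complex.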
There is also a growing consensus that QB may help discover whether there exist in nature more complex instances of persistent entanglement. Through the property of entanglement, electrons (and other massive particles, like buckyballs) interact, separate, and then behave (through spin or momentum) as if they were a single entity. Einstein termed this ‘spooky action at a distance’, regarding it as physically implausible. Yet the post-Newtonian world no longer contains a coherent notion of what ‘physical’ is supposed to mean; ‘physical’ just means whatever we come to understand in certain detail.
Relatedly, philosopher Galen Strawson notes that we know nothing about the physical which should lead us to doubt that experiential phenomena are wholly physical phenomena: ‘You might as well think that the efficacy of the binary system raises doubts about the validity of the decimal system’. Al-Khalili and McFadden are misguided, then, in claiming that the so-called mind-body problem ‘is surely the deepest mystery of our existence’. Newton taught us that bodies are not static, inert entities floating in space, and so we are quite within reason to conclude with Chomsky that matter ‘is no more incompatible with sensation and thought than with attraction and repulsion’.
In a discussion of physicalism and supervenience, Jeffrey Yoshimi notes that ‘physical systems aggregate into increasingly complex structures, existing at different levels of organization’. Like Heisenberg, Plato believed intelligibility was to be found only in the world of geometry and mathematics, with the world of sensation an unreal one. An effective study of astronomy, in his view, requires that ‘we shall proceed, as we do in geometry, by means of problems, and leave the starry heavens alone’.
Following the model of theoretical biology constructed by D’Arcy Thompson along with the emerging evo-devo program, an influential paper on the physical genesis of multicellular forms concludes that, ‘rather than being the result of evolutionary adaptation, much morphological plasticity reflects the influence of external physico-chemical parameters on any material system and is therefore an inherent, inevitable property of organisms’. These observations lead us to two central questions: (i) What explanatory power do basic physical laws have in accounting for biological complexity?; (ii) At what point do we need to invoke further complex processes, like natural selection? Traditionally, those who opted for physical law as their primary explanatory tool were termed ‘formalists’, and included such figures as D’Arcy Thompson, Richard Owen, Stuart Kauffman, Geoffroy St. Hilaire, Richard Goldschmidt, Nikolai Severtzov, Louis Agassiz and Goethe (whose plant studies led him to coin the term ‘rational morphology’). They focused on form and structural commonalities as their explanandum, leaving aside the question of adaptive effects as a secondary concern.
The modern Neo-Darwinian synthesis, heralded primarily by breakthroughs in Mendelian genetics, stands in opposition to formalism, and is typically associated with figures such as Richard Dawkins and Stephen Jay Gould. QB, then, follows the formalist tradition without acknowledging its existence, which perhaps is to be expected considering the widespread acceptance of Neo-Darwinism, adaptationism and functionalism in mainstream evolutionary biology. It is consequently not too surprising that the formalist Ludwig von Bertalanffy’s 1928 book Critical Theory of Morphogenesis influenced the quantum physicist and early quantum biologist Pascual Jordan.
Though not a formalist himself, L.T. Hobhouse (an early critic of the Boer War and the UK’s use of concentration camps in South Africa, which later inspired major Nazi figures) stressed in his 1901 study Mind in Evolution that the chaotic motion both of long grass and of ‘the white blood-corpuscle’ are ‘only very complicated results of the same set of physical laws in accordance with which the grass bows before the wind’. Newton, as Friedrich Lange explained in his classic History of Materialism, ‘had made the theory of some such universal attractive force necessary, by laying completely aside his unripe and vague conjectures as to the material cause of attraction, and kept strictly to what he could prove – the mathematical causes of the phenomena, supposing that there were some principle of approximation operating inversely as the square of the distance, let its physical nature be what it may’. In the case of Newton’s ‘occult force’ of gravity (as he conceived it), ‘the mathematical construction went ahead of the physical explanation, and on this occasion the circumstance was to attain a significance unsuspected by Newton himself’.
Since Newton, common sense understanding of space and causation have been thrown into disarray: ‘The status of causation’, John Collins points out, ‘has been moot ever since Newton’s impugnation of “hypotheses”, notwithstanding the common appeal to the notion as if it were the natural relation par excellence’. It may well be that what happened to notions like causation and space will also happen to quantum; that is, perhaps we will come to accept that the problems of quantum mechanics may be problems of the biology of language and mind.
Converging these views with a rejection of physicalism, Strawson writes: ‘I should admit, though, that I don’t fully know the nature of the physical. No one does. Nearly all of us take it that the physical is essentially spatio-temporal, for example, but no one expert in these matters claims to know for certain what space and time are, or whether they are really fundamental features of reality as we standardly conceive them’.
Chris Daly pointed out in 1998 that, lacking a concept of the physical, ‘no debate between physicalism and dualism can even be set up’. The post-Newtonian world simply does not entertain such ‘material’ notions. The conceptual implications are laid out by Steven Pinker: While quantum physics is infamously counterintuitive, ‘[w]hat is less appreciated is that classical Newtonian physics is also deeply counterintuitive. The theory in the history of physics that is closest to intuitive force dynamics is the medieval notion of impetus, in which a moving object has been imbued with some kind of vim or zest that pushes it along for a while and gradually dissipates’.
In 1737, Francesco Algarotti translated Newtonian physics for the unversed (specifically ‘the Ladies’, in his case) and acknowledged that ‘we are as yet but Children in this vast Universe, and are very far from having a [complete] Idea of Matter; we are utterly unable to pronounce what Properties are agreeable to it, and what are not’. For all the advances of QB, we may ultimately never be able to move beyond the situation described by Strawson: ‘I may also feel I understand – see – why this billiard ball does this when struck in this way by that billiard ball. But in this case there is already a more accessible sense in which I don’t really understand what is going on, and it is an old point that if I were to ask for and receive an explanation, in terms of impact and energy transfer, starting a series of questions and answers that would have to end with a reply that was not an explanation but rather had the form “Well, that’s just the way things are.”’
New paper published in Glossa on the ancient debate of whether language is optimally designed for cognition or communication.
Language design and communicative competence: The minimalist perspective
“In the Minimalist Program, the place of linguistic communication in language evolution and design is clear: It is assumed to be secondary to internalisation. I will defend this position against its critics, and maintain that natural selection played a more crucial role in selecting features of externalization and communication than in developing the computational system of language, following some core insights of Minimalism. The lack of communicative advantages to many core syntactic processes supports the Minimalist view of language use. Alongside the computational system, human language exhibits ostensive-inferential communication via open-ended combinatorial productivity, and I will explore how this system is compatible with – and does not preclude – a Minimalist model of the language system.”