Linguistics / Review Essay

Vol. 6, NO. 2 / August 2021

This essay is the first part of a series on classic texts that have come to be seen as landmark achievements in their fields.

Noam Chomsky published Aspects of the Theory of Syntax in 1965.1 The publication of Syntactic Structures in 1957 had already sounded like the roll of distant thunder. A natural language could be studied at the level of explicitness and rigor common in mathematical logic.2 A revolution was in prospect.3 Having heard thunder, linguists were eager to see lightning. They were not disappointed. Aspects consolidated the revolution. Old-fashioned linguists and behavioral psychologists were scattered into exile.

In undertaking a revolution, Chomsky did what revolutionaries often do. He created his own predecessors, Plato and René Descartes among them. Reviving the notion of Universal Grammar from the seventeenth-century Port-Royal grammarians, Chomsky argued that since every human child could learn any human language, a single abstract grammatical system must be the common property of the human race. Syntactic Structures had offered linguists a theory in the sense understood by the serious sciences. In Aspects, the offer was carried forward and justified. Writing fifty years later, David Pesetsky struck just the right note:

The linguistic capacity of every human being is an intricate system [emphasis added], full of surprises but clearly law-governed [emphasis added], in ways that we can discern by scientific investigation [emphasis added]. Though we still have much to learn about this system, a great deal has been discovered already.4

These are ideas that, in Aspects, Chomsky compelled some linguists to accept: that many have accepted them is a measure of the book’s importance.  

An Acquisition of the Species

A discussion of human creativity typically proceeds from a handful of examples: Aristotle, William Shakespeare, Isaac Newton, Wolfgang Amadeus Mozart, Albert Einstein. Whatever the list, and no matter its length, it embodies the assumption that human creativity is in short supply. All honor to the geniuses, if only because they are rare. Noam Chomsky’s very greatest contribution to thought has involved turning this assumption on its head. Human creativity is an acquisition of the species, the common property of the human race. By virtue of having mastered a natural language—Pesetsky’s intricate system—every human being is in possession of a rich, complex, and creative system of thought.

In Syntactic Structures, Chomsky identified creativity with the recursive structure of a natural language. The human faculty of language is unbounded in precisely the way that the natural number system is unbounded. It is always possible to extend a sentence, as when the cat is on the mat is enlarged to encompass John believes that the cat is on the mat, and it is possible to do this without obvious limit. In making this possibility the gravamen of his concerns, Chomsky revived Wilhelm von Humboldt’s view that language “must make infinite use of finite means.”5 If this is what language does, no one knew how it was done until the development of the theory of recursive functions in the first four decades of the twentieth century. Chomsky had read and studied the masters: Kurt Gödel, Alan Turing, Alonzo Church, and, above all, Emil Post.6 They gave him a theory, and in Syntactic Structures he made use of it.

He was the first linguist to do so.
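The point lends itself to a few lines of code. What follows is a minimal sketch, not Chomsky’s formalism: a single self-referential rule, written as a recursive Python function, generates an unbounded family of sentences from finite means.

```python
# Infinite use of finite means: one finite, self-referential rule
# generates sentences of any length. (An illustrative toy, not the
# grammar of English.)

def sentence(depth: int) -> str:
    """S -> "the cat is on the mat" | "John believes that " + S"""
    if depth == 0:
        return "the cat is on the mat"
    return "John believes that " + sentence(depth - 1)

for n in range(3):
    print(sentence(n))
# the cat is on the mat
# John believes that the cat is on the mat
# John believes that John believes that the cat is on the mat
```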

Following the publication of Syntactic Structures, Chomsky enlarged this idea of linguistic creativity by appealing to his Cartesian camouflage: “[O]ne fundamental contribution of what we have been calling ‘Cartesian linguistics,’” he wrote,

is the observation that human language, in its normal use, is free from the control of independently identifiable external stimuli or internal states and is not restricted to any practical communicative function, in contrast, for example, to the pseudo language of animals.7

This is a large and dramatic claim because it assigns to the ordinary use of language an aspect of human freedom. Thoughts and their expression in language are inclined by circumstances, but they are not impelled by them: they are free both from “the control of independently identifiable external stimuli” and from “internal states.” If this is a claim with overwhelming intuitive plausibility, its radical nature should not be underestimated. It exalts human creativity, but, in doing so, places it beyond the scope of the physical sciences as they are now understood. About this kingdom, as Chomsky recognized, modern science has virtually nothing to say.

Competence and Performance

The true and proper object of linguistic theory, Chomsky argued in Aspects, is the competence of a native speaker—what he knows and not what he says.8

Linguistic theory is concerned primarily with an ideal speaker-listener, in a completely homogeneous speech-community, who knows its language perfectly and is unaffected by such grammatically irrelevant conditions as memory limitations, distractions, shifts of attention and interest, and errors (random or characteristic) in applying his knowledge of the language in actual performance.9

A speaker’s performance is compromised by limitations of memory, hesitations, repetitions, and any number of throat clearings or verbal tics. The object of linguistic theory is the generative system that accounts for a native speaker’s competence, and not the use of this system by mechanisms of parsing and production. This, at once, raised a profound and difficult question: if the performance of a native speaker—what he says—is compromised in various ways, how might he have acquired the underlying system of rules that makes his performance possible? Linguists find the task of specifying such a system very difficult, and it is, even today, by no means complete for any natural language. It is hardly possible that children succeed by sheer induction, presented as they are with data that are compromised and thus degenerate, and under circumstances characterized by what Chomsky, with his gift for memorable formulations, called the poverty of the stimulus. Having posed the problem, Chomsky also proposed its solution:

The problem for the linguist, as well as for the child learning the language, is to determine from the data of performance the underlying system of rules that has been mastered by the speaker-hearer and that he puts to use in actual performance … The grammar of a particular language, then, is to be supplemented by a universal grammar that accommodates the creative aspect of language use and expresses the deep-seated regularities which, being universal, are omitted from the grammar itself.10

The goal of linguistic theory is to provide a theory rich enough to describe any human language by principles general enough to apply to every one of them. Unless such a theory exists, there could be no accounting for the fact that human languages are all learnable.

A generative grammar is a system of rules that assigns structural descriptions to sentences.11 There is no end to sentences and no end to their structural descriptions. The generative grammar represents the linguist’s theory, but it also represents the adult speaker’s tacit linguistic knowledge.

It represents both.

The Standard Theory

Aspects presented linguists with what, at once, became the Standard Theory. Syntactic Structures had already offered the essentials. A grammar of a natural language comprises phrase structure and transformational rules. Phrase structure rules break sentences into constituents, the process ultimately yielding a terminal string in which constituents no longer contain constituents. These rules generate hierarchical structures or phrase markers—tree diagrams, in fact. Transformational rules, on the other hand, map phrase markers onto phrase markers. Transformational rules had been introduced by Chomsky’s mentor, Zellig Harris, but in Syntactic Structures they were, for the first time, embedded in a purely formal context.

In Aspects, the ideas first found in Syntactic Structures were amplified. The Standard Theory is a computational system. Rules are formal because they are explicitly specified: there is no appeal to meaning. The grammar consists of syntactic, semantic, and phonological components, and, in addition, it contains, or makes use of, a lexicon, something like a formal dictionary.12 Syntax is under the control of phrase structure and transformational rules. Phrase structure rules are formulated as context-free rewriting rules.13 A category symbol A, where A might designate S (for sentence), is dissected into a string Z of one or more symbols: A → Z / X_Y, where the context afforded by X and Y is null. The symbols themselves may represent lexical categories, such as noun (N) or verb (V); syntactic categories such as sentence (S); and syntactic constituents such as noun and verb phrases (NP and VP).

The grammar also contains context-sensitive rules: A → Z / X_Y, where X or Y is not null. These rules serve to insert lexical items into phrase markers.14 It matters a great deal where they are inserted. Morris plays lapta is fine; not so Lapta plays Morris. The appeal to context is ineliminable. Context-free and context-sensitive rules generate the phrase markers underlying sentences: [S [NP [N]] [VP V [NP [N]]]] is an example drawn down to the level of syntactic categories; and on lexical insertion, there is [S [NP Morris] [VP plays [NP lapta]]]. From these phrase markers, it is possible to recover old-fashioned grammatical functions—the fact that Morris is the subject of the sentence in which he is playing lapta. Functions are treated as two-place relations: x is the subject of y. These functional relationships may be seen in plain sight on the phrase marker itself, with one node marking the subject of a sentence, and another, its object. The result is what Aspects, in a phrase now famous, called deep structure. Transformational rules then map deep structures onto surface structures—those structures ready to enter the gabble of communication.
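A toy implementation suggests how little machinery is involved. In the sketch below, the rule table, the miniature lexicon, and the bracketed output are illustrative assumptions, not the notation of Aspects; lexical insertion is reduced to a simple dictionary lookup.

```python
# Context-free rewriting, top-down: each category symbol is rewritten
# until only lexical categories remain, then words are inserted.

RULES = {"S": ["NP", "VP"], "NP": ["N"], "VP": ["V", "NP"]}
LEXICON = {"N": {"Morris", "lapta"}, "V": {"plays"}}

def derive(symbol: str, words: list) -> str:
    """Return a labeled bracketing of `words` rooted in `symbol`."""
    if symbol in RULES:
        parts = [derive(child, words) for child in RULES[symbol]]
        return "[" + symbol + " " + " ".join(parts) + "]"
    word = words.pop(0)                  # lexical insertion
    assert word in LEXICON[symbol], f"{word!r} is not of category {symbol}"
    return f"[{symbol} {word}]"

print(derive("S", ["Morris", "plays", "lapta"]))
# [S [NP [N Morris]] [VP [V plays] [NP [N lapta]]]]
```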

Chomsky electrified the community of linguists by persuasively arguing that the surface structures of a natural language are no sure guide to its deep structures; indeed, the distinction between deep and surface structure was widely appreciated as one of the theory’s greatest insights. In insisting on the distinction, and its importance, Chomsky appealed to brilliantly chosen examples. In Syntactic Structures, he had introduced the now famous sentence Colorless green ideas sleep furiously in order to demonstrate that there exist perfectly grammatical English sentences that don’t mean a thing. It followed that syntax and semantics were independent, a large conclusion derived from a small example. In Aspects, examples multiplied. The sentences John is easy to please and John is eager to please are on their surface very similar, differing as they do in only one word and otherwise conforming to the same grammatical pattern: NP Cop Adj to VP. Appearances are misleading. These sentences are not at all similar. From John is easy to please it follows that it is easy to please John, but nothing like this follows from John is eager to please. On the other hand, John’s eagerness to please follows from the fact that John is eager to please, but there is nothing like John’s easiness to please, even though it is easy to please John. These two sentences are radically different, and it is on the level of deep structure that their differences are evident. In arguing in this way with respect to a great many examples, Chomsky was making specific points, but he was also doing more. He was introducing linguists to a new style of argument.

Recursion Redux

Recursion figured prominently in Syntactic Structures. Syntactic rules can refer back to themselves and thus may apply to their own outputs. In Aspects, sentences themselves became objects of recursive embedding in the base rules, replacing certain transformational rules. This was a major technical development. A sentence (S) may be dissected into a noun phrase and a verb phrase

S → NP VP.

Well and good. A noun phrase may now be dissected into a noun phrase and a sentence

NP → NP S.

A verb phrase may then be dissected into a verb and a noun phrase

VP → V NP.

And, in view of NP → NP S, a verb phrase may also be dissected into a verb, a noun phrase, and a sentence. This makes possible the generation of structures such as

[S [NP John [S who met Mary]] knows Sue],

as well as

[S the linguist [S that met the mathematician
[S that knows the student [S that … ]]]].

The introduction of sentential recursion, with S hanging on for dear life from both sides of a phrase marker, introduced a notable economy into the Standard Theory. Syntactic Structures had handled the matter by hand, inserting sentential phrase markers in other sentential phrase markers. Fewer symbols were now required, the derivation of complex clauses simplified, the theory streamlined.

With recursion, there is, in Aspects, a return to the creativity of language:

The infinite generative capacity of the grammar arises from a particular formal property of these categorical rules, namely that they may introduce the initial symbol S into a line of a derivation. In this way, the rewriting rules can, in effect, insert base Phrase-markers in other base Phrase-markers, this process being iterable without limit.15
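The economy that sentential recursion buys can be exhibited directly. The sketch below assumes only the rule NP → NP S; the function name and the word list are illustrative, not the book’s.

```python
# NP -> NP S, iterated: each sentence introduces a noun phrase, which
# may in turn introduce another sentence, without limit.

def relativize(nouns: list) -> str:
    head, *rest = nouns
    if not rest:
        return f"[NP {head}]"
    return f"[NP [NP {head}] [S that met {relativize(rest)}]]"

print(relativize(["the linguist", "the mathematician", "the student"]))
# [NP [NP the linguist] [S that met
#   [NP [NP the mathematician] [S that met [NP the student]]]]]
```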

Ineliminable Transformations

The Standard Theory offered linguists a formal structure with two quite different kinds of formal rules—phrase structure and transformational. Recursion got rid of some transformations, but not all. The resulting structure is, if not inelegant, then, at least, somewhat clumsy. Why two? Empirical justifications for transformational rules arose from the mismatches between deep and surface structures. The passive voice is an example. In a passive sentence, the logical object of a verbal predicate occurs in the subject position. John was convinced by Bill to leave consists of two sentences (S):

[S John was convinced by Bill [S _ to leave]].

John is the grammatical subject of the main sentence, but not its logical subject, which is Bill. On the other hand, the logical subject of the embedded sentence is not Bill, but John.

Transformational rules apply from the embedded constituent of a sentence to its outermost constituent. They can insert, erase, substitute, and reorder linguistic constituents. The passive transformation is again an example:

NP1 V NP2 ⇒ NP2 be + V-ed by + NP1.

This transformation applies to a phrase marker consisting of a nominal constituent NP1 followed by a verb (V), itself followed by a second, distinct nominal constituent NP2. The transformation specifies the result of this operation: NP1 and NP2 are reordered, the auxiliary be is added to V, as is the passive morphology -ed, and the preposition by is added to the postposed NP1.
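As a first approximation, the passive transformation can be written as a function from structures to structures. The sketch below operates on a bare triple rather than a full phrase marker and handles only regular verbal morphology; both simplifications are assumptions of the illustration.

```python
# NP1 V NP2  =>  NP2 be + V-ed by + NP1, over a flat triple.

def passivize(np1: str, verb: str, np2: str) -> str:
    participle = verb + ("d" if verb.endswith("e") else "ed")
    return f"{np2} is {participle} by {np1}"  # reorder, add be, -ed, by

print(passivize("Bill", "frighten", "the boy"))
# the boy is frightened by Bill
```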

None of this can be handled by phrase structure rules, unless the phrase structure rules are themselves allowed to increase without limit. If transformational rules are ineliminable within the context of phrase structure grammars, they seemed, nevertheless, to carry something of the arbitrary. It is therefore one of the ironies of intellectual history that, far from transformations being purged from theoretical syntax, it has been the other way around, with phrase structure rules themselves dwindling in favor of transformational operations in the minimalist program.

An Old-Fashioned

Beyond its obvious contribution to syntactic theory, Aspects offered linguists a rich and subtle analysis of old-fashioned grammatical categories—noun, verb, adjective, adverb, and the like. Although obviously answering to something, these categories were never clearly defined. A noun was traditionally defined as an expression designating a person, place, or thing. The definition is obviously inadequate. In the sentence Luck is a great virtue, “luck” is a noun but not one designating a person, place, or thing. There are many other examples. Making use of a technique first introduced by Roman Jakobson, Chomsky purged these didactic definitions in favor of a scheme in which each syntactic category was flagged by a finite set of binary-valued features. The word dog thus enters the lexicon marked as [+N], for noun; the word barks, as [+V], for verb. Neither [+N] nor [+V] receives any further definition, but they do determine how lexical categories behave.16 Their meaning is in their use, as Ludwig Wittgenstein remarked, and their use is governed by their rules, the rules in turn governed by their features. These features serve to discriminate transitive verbs such as frighten from intransitive verbs such as sleep. Both frighten and sleep are specified with an inherent [+V] feature: they are both verbs; but frighten, unlike sleep, is specified by a trailing [+N]. It takes an object. The introduction of categorical selection rules—what goes where—ensures that verbs like frighten are inserted in a phrase marker in the context of a nominal constituent ([+N]), while verbs like sleep are not. The professor frightens the boy is grammatical. The professor sleeps the boy is not.

Chomsky also proposed to distinguish between categorical and semantic selectional features. A verb like frighten requires a [+animate] object; not so a verb such as praise. The sentence The professor frightens sincerity is grammatical, even though it is semantically deviant, whereas The professor praises sincerity is grammatical and otherwise just fine.17 The introduction of contextual selection rules ensures that frighten is inserted in the phrase marker in the structural context of a [+N] [+animate] object.
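The two kinds of selection can be mimicked with a toy lexicon. In the sketch below, the feature encoding and the three-way verdict are illustrative assumptions; in the theory itself, the work is done by subcategorization and selectional rules, not by a function.

```python
# Binary features on lexical entries, with categorial selection
# (does the verb take an object?) distinguished from semantic
# selection (must the object be [+animate]?).

LEXICON = {
    "frighten": {"takes_object": True,  "object_animate": True},
    "sleep":    {"takes_object": False, "object_animate": False},
    "praise":   {"takes_object": True,  "object_animate": False},
}
NOUNS = {"the boy": True, "sincerity": False}   # [+/-animate]

def verdict(verb: str, obj: str = None) -> str:
    entry = LEXICON[verb]
    if (obj is not None) != entry["takes_object"]:
        return "ungrammatical"                  # categorial violation
    if obj and entry["object_animate"] and not NOUNS[obj]:
        return "semantically deviant"           # selectional violation
    return "fine"

print(verdict("frighten", "the boy"))    # fine
print(verdict("sleep", "the boy"))       # ungrammatical
print(verdict("frighten", "sincerity"))  # semantically deviant
print(verdict("praise", "sincerity"))    # fine
```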

In developing his theory of syntactic features, Chomsky was heeding methodological constraints: he was responding to the imperative to keep his theory simple. Context-sensitive rules could well be used to capture the distinction between frighten and sleep, but only by adding complexity to the grammar. The introduction of syntactic features is one of the most important contributions of Aspects.18 It leads to one of Chomsky’s boldest and most dramatic conclusions. The lexicon of a natural language, with its constituents flagged by various syntactic, semantic, and phonological features, is the very place where one language is unlike another. Beyond the lexicon, every human language is governed by the same structures of universal grammar, and in this sense, Chomsky argued, there is only one human language.

One human language! This is surely among the most provocative and dramatic claims of the last half century.

First Principles

Linguistic theory aims to derive linguistic facts from first principles, an ultimate goal linguistics shares with the other sciences. What would these principles be for language? One universal principle stems from the Standard Theory: the structure dependency of syntactic rules. Thus S rewrites as NP and VP. NP and VP are sister nodes, both structural dependents of S. Ditto for NP → Det N and VP → V NP. The top-down application of the rewriting rules generates structural dependencies between syntactic constituents. The rule governing relative clauses rewrites an NP as [NP NP S]. This rule ignores the linear position of the NP. Relative clauses can be generated both in subject position, as in [S The student of physics [S who met your advisor] is in my class], and in object position, as in [S I know the student of physics [S who met your advisor]]. A relative clause modifies an NP and not the embedded nominal constituent within that NP. The relative clause [S who met your advisor] does not modify the nominal constituent [physics], even though this nominal constituent immediately precedes it. Structural dependency is a first principle of the language faculty. Linear order is not.

Transformational rules, as defined in the Standard Theory, are structure dependent: they apply to the structural description of a sentence, specify the structural changes, and derive the resulting transformed structure. Transformations may also be associated with conditions on their application. For example, certain transformations apply to main clauses but not to embedded clauses. This is the case for closed yes or no questions. The transformation applies to the underlying structure of sentences such as [S John is here] and yields [S Is John here]. Examples such as these seem to indicate that the transformation relies on surface linearity, inverting the auxiliary and the immediately preceding nominal constituent. But a sentence with a more complex subject, such as [S [NP The professor of John] is here], illustrates that the transformation is in fact structure dependent. If it were not, it could apply to the auxiliary and the immediately preceding nominal constituent John, yielding the ungrammatical [The professor of is John here]. Instead, the transformation applies to the full NP structure and yields [S Is [NP the professor of John] here]. It might very well be the case that the structure dependency of syntax is rooted in language design and so is a first principle of the language faculty.

Why? No one knows.
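The contrast between the structure-dependent rule and its linear rival can nevertheless be made concrete. In the sketch below, the list encoding of the phrase marker and both rule implementations are illustrative assumptions: the structural rule moves the auxiliary over the whole subject NP; the linear rule, over the nearest word.

```python
# Structure-dependent vs. linear auxiliary fronting.

# The subject NP is kept as a unit; the sentence is [NP, aux, rest].
tree = [["the", "professor", "of", "John"], "is", "here"]

def front_structural(s: list) -> list:
    subject, aux, *rest = s          # sees the full NP as one node
    return [aux, *subject, *rest]

def front_linear(words: list) -> list:
    i = words.index("is")            # sees only the string of words
    return words[:i - 1] + ["is", words[i - 1]] + words[i + 1:]

flat = [*tree[0], tree[1], tree[2]]
print(" ".join(front_structural(tree)))
# is the professor of John here        <- grammatical
print(" ".join(front_linear(flat)))
# the professor of is John here        <- ungrammatical
```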

Open Questions

Aspects left open several questions for further inquiry, considering alternative hypotheses with respect to the relevant levels of representation, the properties of the syntactic rules, and the principles of Universal Grammar.19 These questions have been investigated in the course of the development of generative grammar. The discovery that syntactic rules apply across categories led to the elimination of the multiple rewriting rules postulated in Aspects, in favor of general rule schemata in Government and Binding theory. Transformational rules were reduced to two general operations: move NP, displacing nominal constituents, and move wh-, displacing operators such as who, what, where, and when in open question formation. In the minimalist program,20 syntactic operations are reduced to Merge (x, y), where x and y are two syntactic objects. Current work investigates the consequences of distinguishing Set Merge, a symmetrical operation deriving unordered sets of constituents, from Pair Merge, an asymmetrical operation deriving ordered sets of constituents.

Another interesting question left open in Aspects is whether syntactic rules yield the linear order of syntactic constituents, as in the Standard Theory, which generates the ordered string John eats flies, or whether they leave the constituents they combine unordered, as in the set {John, eats, flies}, which is on set-theoretical grounds identical to {eats, John, flies}. The minimalist program investigates, and, indeed, champions the second hypothesis: the linearization of syntactic constituents is handled by the phonological component of the grammar. The very deepest operations of the human mind are indifferent to what might appear to be the most fundamental fact about human language—that words follow one another in a particular order. In all of these arguments, a greater, grander argument is always at work. Universal Grammar must account for the rapid emergence of language in the species, and it must account for its rapid acquisition in the individual. Nothing less than radical simplicity can serve either goal.
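The distinction, and the set-theoretic identity, are easy to exhibit. A minimal sketch, assuming only that Set Merge may be modeled by Python’s frozensets and Pair Merge by its tuples:

```python
# Set Merge is symmetrical: {x, y} == {y, x}. Pair Merge is not.

def set_merge(x, y):
    return frozenset({x, y})

def pair_merge(x, y):
    return (x, y)

vp = set_merge("eats", "flies")
s = set_merge("John", vp)
print(s == set_merge(set_merge("flies", "eats"), "John"))          # True
print(pair_merge("eats", "flies") == pair_merge("flies", "eats"))  # False
# Linear order is absent from s; on this view, it must be imposed
# elsewhere, by the phonological component.
```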

Influence Beyond Linguistics

By defining the object of inquiry of linguistic theory as internal to the mind, linguistic theory led to the creation of a new interdisciplinary field of inquiry devoted to the study of the biological basis of language, the so-called Biolinguistic Program.21 Recent research confirms the importance of generative grammar for an understanding of the language faculty as a specifically human trait.22 The language faculty, like other biological systems, is genetically rooted. Under normal conditions, it develops very early in the child without conscious efforts or extensive training. Animals cannot learn a human language, much to their regret and ours. Monkeys can spontaneously master the weakest of finite-state grammars, but they cannot reach the context-free grammars, which are characteristic of human language, and hierarchical structures are, for this reason, beyond them.23

Nothing in the neurosciences is yet as subtle and detailed as the Standard Theory, but it has been established that Broca’s area supports the processing of syntax. Human beings are programmed to compute linguistic recursion. A part of Broca’s area would appear to be dedicated to complex syntactic structures: Brodmann area 44 is activated by center embeddings; Brodmann area 45, by movement.24 Other studies in the cognitive neurosciences indicate that the human brain is sensitive to structure-dependent computation when processing language. This is the case for sentence processing as well as for the processing of phrasal constituents.25 Yet other studies indicate that the brain processes deep structures, largely ignoring their surface form.26 “Linguistic theory is mentalistic,” Chomsky wrote somewhat defiantly, “since it is concerned with discovering a mental reality underlying actual behavior.”27 Linguistic theory is still mentalistic, but, step by step, research is uncovering its physical roots in the neurophysiology of the human brain.

An Enduring Legacy

Aspects introduced a revolution within linguistics. The subject has never been the same again. It promoted linguistics into a science, one that accepted the methods and the standards of the serious sciences themselves. It did more. It championed an integrated study of organic systems, an interdisciplinary field of inquiry bridging results from linguistics and other sciences. And it did still more. It achieved what only the most profound of scientific revolutions achieves: the transformation of what initially seemed outrageous into what currently seems commonplace. Children do learn their native language without effort or instruction; a human language is a system of dazzling and poorly understood complexity; some things must be innate if anything is ever to be acquired; there is a distinction between competence and performance; the most robust system of assessment in studying grammar is a native speaker’s intuitions; and the ability of every human being to use his language for creative means is a mystery that we have not penetrated and may never understand.28


  1. Noam Chomsky, Aspects of the Theory of Syntax (Cambridge, MA: MIT Press, 1965). 
  2. The mathematical basis of generative grammar was published in different articles, including Noam Chomsky, “Three Models for the Description of Language,” IRE Transactions on Information Theory 2, no. 3 (1956): 113–24, doi:10.1109/tit.1956.1056813; and George Miller and Noam Chomsky, “Finitary Models of Language Users,” in Handbook of Mathematical Psychology, vol. 2, ed. Duncan Luce, Robert Bush, and Eugene Galanter (New York: Wiley, 1963), 419–91. See also Thomas Bever, “The Cognitive Basis for Linguistic Structures,” in Cognition and the Development of Language, ed. John Hayes (New York: Wiley & Sons, 1970), 277–360. 
  3. Transformational generative grammar stands in contradistinction with the structuralist-behaviorist paradigm, prevalent in the first half of the twentieth century. By targeting surface phenomena, structuralist grammars were inevitably drawn to listing exceptions and irregularities instead of capturing language regularities and generalizations. Structuralist grammars were not concerned with Universal Grammar, the human-specific trait enabling the child’s ability to develop language naturally, and they endorsed the behaviorist view of language according to which language is acquired by general mechanisms such as induction, analogy, training, and reinforcement. See Chomsky’s famous review: Noam Chomsky, “Review of Skinner’s Verbal Behavior,” Language 35 (1959): 26–58, doi:10.2307/411334. See also Eric Lenneberg, “On Explaining Language,” Science 164, no. 3,880 (1969): 635–43, doi:10.1126/science.164.3880.635. 
  4. David Pesetsky, “Forecast: Sunny with Scattered Annoyances, but with a Chance of Storms (Recommended Action: Very Basic Linguistics Education),” paper presented at the conference Generative Syntax in the Twenty-First Century: The Road Ahead, Athens, Greece, May 28–30, 2015, 1. 
  5. Wilhelm von Humboldt, “Ueber das vergleichende Sprachstudium in Beziehung auf die verschiedenen Epochen der Sprachentwicklung (On the Comparative Study of Language and its Relation to the Different Periods of Language Development),” in Gesammelte Schriften, vol. 7 (Berlin: Behr, 1907), 98–99. Translation by the editors. 
  6. For a detailed examination of Post’s work and significance, see Allyn Jackson, “Emil Post: Psychological Fidelity,” Inference: International Review of Science 4, no. 2 (2018), doi:10.37282/991819.18.48. 
  7. Noam Chomsky, Cartesian Linguistics (Cambridge, UK: Cambridge University Press, 2009), 76. 
  8. In B. F. Skinner’s Verbal Behavior, the distinction is missing, one reason that Chomsky dismissed his theories with disdain. To accept the distinction is almost at once to reject behaviorism in psychology. B. F. Skinner, Verbal Behavior (Hoboken, NJ: Prentice-Hall, Inc., 1957). 
  9. Chomsky, Aspects of the Theory of Syntax, 3. 
  10. Chomsky, Aspects of the Theory of Syntax, 4, 6. 
  11. Chomsky writes: “The term ‘generate’ is familiar in the sense intended here in logic, particularly in Post’s theory of combinatorial systems.” Chomsky, Aspects of the Theory of Syntax, 9. 
  12. Chomsky, Aspects of the Theory of Syntax, 84. 
  13. A generative grammar is a computational system analogous to an automaton, with the capacity to read, write, and display information. Chomsky defines a hierarchy of formal grammars, ranked according to their increasing generative complexity (Chomsky, “Three Models”). Each grammar is associated with an automaton of equivalent capacity. It has been established that a grammar adequate to the description of a human language must have at least the generative capacity of a phrase structure grammar. See also Noam Chomsky, “On Certain Formal Properties of Grammars,” Information and Control 2, no. 2 (1959): 137–67, doi:10.1016/s0019-9958(59)90362-6; Noam Chomsky, “Formal Properties of Grammars,” in Handbook of Mathematical Psychology, vol. 1, 323–418. 
  14. For example, verbs such as see and meet select an NP object and an animate NP subject, which is not the case for verbs such as grow. 
  15. Chomsky, Aspects of the Theory of Syntax, 142. 
  16. In Chomsky’s essay “Remarks on Nominalization,” the binary syntactic features [±N] and [±V] are used to define the major syntactic categories, N: [+N, –V], V: [–N, +V], ADJ: [+N, +V], P: [–N, –V]. This allows for the identification of natural classes of categories, that is, categories undergoing the same syntactic operations, by using a formal property of binary feature systems. See Noam Chomsky, “Remarks on Nominalization,” in Readings in English Transformational Grammar, ed. Roderick Jacobs and Peter Rosenbaum (Waltham, MA: Ginn, 1970): 184–221. 
  17. The autonomy of syntax with respect to semantics is established in Syntactic Structures on the basis of Chomsky’s example colorless green ideas sleep furiously. While semantically deviant, such sentences are generated by the grammar of English. This is not the case for ungrammatical strings such as *colorless sleep furiously ideas green. 
  18. See, among other works, David Pesetsky and Esther Torrego, “The Syntax of Valuation and the Interpretability of Features,” in Phrasal and Clausal Architecture, ed. Simin Karimi, Vida Samiian, and Wendy Wilkins (Amsterdam: John Benjamins, 2007), 262–94. 
  19. Another question left open in Aspects is the distribution of labor between components of the grammar, including the lexicon, the syntax, and the semantic components. As mentioned previously, alternative hypotheses are considered in Aspects, along with their consequences. These alternatives have been investigated in the course of the development of generative grammar, giving rise to further questions and problems to solve. One question is whether derived nominals, such as destruction and refusal are derived by syntactic rules, on a par with gerundive nominals, such as destroying and refusing, or subject to combinatorial rules in the lexicon or in a distinct morphological component of the grammar. See, for example, Chomsky, “Remarks on Nominalization”; Mark Aronoff, Morphology by Itself (Cambridge, MA: MIT Press, 1996); Anna Maria Di Sciullo and Edwin Williams, On the Definition of Word (Cambridge, MA: MIT Press, 1987); Morris Halle and Alec Marantz, “Distributed Morphology and the Pieces of Inflection,” in The View from Building 20, ed. Ken Hale and Samuel Keyser (Cambridge, MA: MIT Press, 1993), 111–76. 
  20. Noam Chomsky, The Minimalist Program (Cambridge, MA: MIT Press, 1995). Noam Chomsky, Angel Gallego, and Dennis Ott, “Generative Grammar and the Faculty of Language: Insights, Questions, and Challenges,” Catalan Journal of Linguistics (2019), doi:10.5565/rev/catjl.288. 
  21. Noam Chomsky, “On Minds and Language,” Biolinguistics 1 (2007): 9–27; Massimo Piattelli-Palmarini, Juan Uriagereka, and Pello Salaburu, Of Minds and Language: A Dialogue with Noam Chomsky in the Basque Country (Oxford: Oxford University Press, 2009), 469–72; Anna Maria Di Sciullo et al., “The Biological Nature of Human Language,” Biolinguistics 4, no. 1 (2010): 4–34; Robert Berwick and Noam Chomsky, “The Biolinguistic Program: The Current State of Its Development,” in The Biolinguistic Enterprise, ed. Anna Maria Di Sciullo and Cedric Boeckx (Oxford: Oxford University Press, 2011), 19–41; Anna Maria Di Sciullo and Lyle Jenkins, “Biolinguistics and the Human Language Faculty,” Language 92, no. 3 (2016): e205–36, doi:10.1353/lan.2016.0056; Noam Chomsky, “The Language Capacity: Architecture and Evolution,” Psychonomic Bulletin and Review 24, no. 1 (2017): 200–203, doi:10.3758/s13423-016-1078-6; Anna Maria Di Sciullo, ed., Biolinguistics: Critical Concepts in Linguistics, vols. 1–4 (New York: Routledge, Taylor and Francis, 2017).  
  22. Karin Stromswold, “Genetics and the Structure, Acquisition and Evolution of Language,” paper presented at Biolinguistic Investigations, Santo Domingo, Dominican Republic, February 24, 2007; Karin Stromswold, “The Genetics of Speech and Language Impairments,” New England Journal of Medicine 359, no. 22 (2008): 2,381–83; Karin Stromswold, “Genetics and the Evolution of Language: What Genetic Studies Reveal about the Evolution of Language,” in The Evolution of Human Language: Biolinguistic Perspectives, ed. Richard Larson, Viviane Déprez, and Hiroko Yamakido (Cambridge: Cambridge University Press, 2010), 176–90, doi:10.1017/CBO9780511817755.013. 
  23. See, among other works, Tecumseh Fitch and Marc Hauser, “Computational Constraints on Syntactic Processing in a Nonhuman Primate,” Science 303, no. 5,656 (2004): 377–80, doi:10.1126/science.1089401. 
  24. Michiru Makuuchi et al., “Segregating the Core Computational Faculty of Human Language from Working Memory,” Proceedings of the National Academy of Sciences 106, no. 20 (2009): 8,362–67, doi:10.1073/pnas.0810928106; Yosef Grodzinsky and Andrea Santi, “Working Memory and Syntax Interact in Broca’s Area,” NeuroImage 37, no. 1 (2007): 8–17, doi:10.1016/j.neuroimage.2007.04.047; Yosef Grodzinsky and Andrea Santi, “fMRI Adaptation Dissociates Syntactic Complexity Dimensions,” NeuroImage 51, no. 4 (2010): 1,285–93, doi:10.1016/j.neuroimage.2010.03.034. 
  25. Andrea Moro et al., “Syntax and the Brain: Disentangling Grammar by Selective Anomalies,” NeuroImage 13, no. 1 (2001): 110–18, doi:10.1006/nimg.2000.0668; Esti Blanco-Elorrieta and Liina Pylkkänen, “Composition of Complex Numbers: Delineating the Computational Role of the Left Anterior Temporal Lobe,” NeuroImage 124 (2016): 194–203, doi:10.1016/j.neuroimage.2015.08.049. 
  26. Christos Pliatsikas et al., “Processing of Zero-Derived Words in English: an fMRI Investigation,” Neuropsychologia 53 (2014): 47–53, doi:10.1016/j.neuropsychologia.2013.11.003. See also Angela Friederici, Language in Our Brain: The Origins of a Uniquely Human Capacity (Cambridge, MA: MIT Press, 2020), for a summary of the results on the neurobiological foundations of language indicating that species-specific brain differences may be at the root of the human capacity for language. 
  27. Chomsky, Aspects of the Theory of Syntax, 4. 
  28. I gratefully acknowledge lively feedback from David Berlinski. 

Anna Maria Di Sciullo is Professor of Linguistics at the University of Quebec at Montreal.

