The innovative contributions of generative grammar

A while ago, Roberta D’Alessandro asked her Facebook friends to say something positive about an approach that they otherwise criticize or reject, with the goal of making the interaction among linguists more positive. Since I often criticize generative approaches, I felt invited to say something positive about generative syntax, and here is what I came up with:

“The great contribution of generative syntax has been a shift of perspective: Instead of merely saying things about what we observe, linguists should build a model of what a speaker must know in order to show the behaviour she shows. This led to a tremendous broadening of the questions that were asked (at least in syntax), and to a new methodology – asking speakers for acceptability judgements – that proved extremely productive and powerful.”

Even though this was only a brief Facebook comment, I still think it summarizes the innovations quite well, but let me elaborate a bit. I have read a fair amount of pre-1957 linguistics, and two striking differences between the earlier and the newer works are the following:

(1) Informal experimentation: Before 1957 (Chomsky 1957), there was hardly any explicit syntactic experimentation – in the sense of asking speakers for acceptability judgements of constructed sentences (or morphological forms). There was also very little use of the asterisk for impossible sentences or forms (though the asterisk notation did not originate with generative grammar, as shown by Graffi 2002). I call the approach of classical generative grammar “informal experimentation”, because it was not based on rigorous experimental procedures as used in psychology. (This has occasionally been criticized, e.g. by Gibson & Fedorenko (2013), but for languages that do not have hundreds of researchers working on them, informal methods are very useful and revealing.)

(2) Complete syntactic paradigms: Before 1957, there was no ambition to formulate syntactic rules in such a way as to capture ALL the regularities of a language, i.e. everything that a speaker must know. Morphological paradigms were supposed to be complete in grammatical descriptions, but syntactic paradigms were often very partial. (Linguists of earlier generations must have elicited rarely occurring forms in order to complete their morphological paradigms, but it seems that they did not see syntax as consisting of paradigms that need to be completed. Generative grammarians did not use the term “paradigm” either, but they tried to present a complete picture of a subdomain of grammar.)

The methodological innovation of systematic experimentation and the new ambition to be comprehensive and explicit along the lines of mathematical formalisms seem to have been the main driving forces behind the striking new findings of the late 1960s and 1970s. The fairly tight restrictions on the behaviour of anaphoric personal and reflexive pronouns, initially discussed under the heading of “pronominalization” (e.g. Langacker 1969), had hardly been noticed before the 1960s. The meanings of personal pronouns were always more or less obvious, and as long as linguists adopted the perspective of readers (often readers of ancient texts), they tended to overlook many regularities (Huang 2000 provides an accessible overview of a wide range of findings concerning anaphora).

The 1970s also saw the first thorough discussions of the behaviour of subjects and objects, and of function-changing operations like passive and causative constructions, which later led to ambitious proposals such as Keenan (1976), Relational Grammar (e.g. Blake 1990), Lexical-Functional Grammar treatments (e.g. Falk 2006), Role and Reference Grammar (Van Valin 2005), and indirectly also to deeper functionalist theories such as those in Croft (2001). These works were “thorough” in the sense that they looked for a wide range of possible arguments/tests/diagnostics for the syntactic-function status of a given nominal, and once a new diagnostic was found, it was added to the armamentarium of subsequent research on other languages, often with very interesting results.

I was exposed to Relational Grammar in graduate school classes taught by Donna Gerdts in 1987–1988 at the University at Buffalo, and I have been impressed by the richness of the papers coming out of the Perlmutter/Keenan/Bresnan traditions ever since. (There has also been some really interesting broadly comparative work looking at many languages from around the world, such as Cinque (1999) and Baker (2008; 2015), but these works do not derive so many insights from the core innovations of generative grammar, I feel.)

And of course, generative grammar has also made pioneering discoveries in the domain of displacement constructions, especially question-word fronting, which turned out to be governed by very interesting regularities in all languages in which it is found (e.g. Ross 1967, Chomsky 1977). Again, these discoveries in individual languages laid the indispensable groundwork for later functionalist work that provided more comprehensive explanatory theories (Hawkins 1999; 2014).

(The above list does not exhaust the areas where the new approach has proved fruitful, of course, but this post should not get too long. Let me just note here that I do not see the same wealth of new generative-inspired findings in simpler patterns, such as those of phonology, inflectional or derivational morphology, and syntactic coding, e.g. flagging/case-marking and indexing/agreement.)

Now it is an interesting question whether the two key innovations of generative grammar – informal experimentation and complete syntactic paradigms – are closely related to Noam Chomsky’s larger programme of studying the human mind through language(s). It is difficult to know, but it seems to me that other factors, quite independent of the peculiar mentalist and universalist perspective that drove generative grammar’s founder, may have played a role:

– The increasing importance and prestige of English: After WWII, the German and French languages (as well as Oxford and Cambridge) became much less important for American intellectuals in general, and it became more natural to be a serious scholar of language without knowing the prestige languages of earlier generations (which also included Latin and Greek, of course). While Greek, Latin and French had difficult inflectional paradigms that distracted from their syntactic paradigms, this distraction was absent in English, with its very rich syntactic paradigms (e.g. its rigid word order patterns in inversion and topicalization constructions).

– The decreasing importance of historical linguistics: While European linguists had focused on the histories of their national languages since the mid-19th century, this impetus was much weaker in the United States, a new nation that did not have or need an ancient history as ideological support. This led to a greater interest in all aspects of the synchronic language(s), including syntax (which was not well suited to historical study).

– The rise of intelligent machines: By the 1950s, the idea of a thinking and speaking machine was not only imaginable but seemed within reach, as machines were becoming more and more sophisticated, enabling space travel, traffic regulation and the automation of production on an unprecedented scale. Mathematicians like Alan Turing and engineers like Claude Shannon had laid the foundations for a new range of technologies, so the idea of thinking about human language in a similar way was no longer far-fetched.

Thus, even though the innovations of generative grammar are intimately linked with the name of a single scholar (Noam A. Chomsky), the factors that made his ideas popular seem to have existed independently. And I think there is no question that the key innovations (experimentation and complete paradigms) are hugely important methodological contributions that will outlast Chomsky, the current “generative grammar” tradition(s), and most of the particular ideas about an innate universal grammar that have been suggested along the way.

References

Baker, Mark C. 2008. The syntax of agreement and concord. Cambridge: Cambridge University Press.

Baker, Mark C. 2015. Case: Its principles and parameters. Cambridge: Cambridge University Press.

Blake, Barry J. 1990. Relational grammar. London: Routledge.

Chomsky, Noam A. 1957. Syntactic structures. ’s-Gravenhage: Mouton.

Chomsky, Noam A. 1977. On wh-movement. In Adrian Akmajian, Peter W. Culicover & Thomas Wasow (eds.), Formal syntax, 71–132. New York / Dublin: Academic Press. https://ci.nii.ac.jp/naid/10009706728/

Cinque, Guglielmo. 1999. Adverbs and functional heads: A cross-linguistic approach. New York: Oxford University Press.

Croft, William. 2001. Radical construction grammar: Syntactic theory in typological perspective. Oxford: Oxford University Press.

Falk, Yehuda N. 2006. Subjects and Universal Grammar: An explanatory theory. Cambridge: Cambridge University Press.

Gibson, Edward & Evelina Fedorenko. 2013. The need for quantitative methods in syntax and semantics research. Language and Cognitive Processes 28(1–2). 88–124. doi:10.1080/01690965.2010.515080.

Graffi, Giorgio. 2002. The asterisk from historical to descriptive and theoretical linguistics: An historical note. Historiographia Linguistica 29(3). 329–338. doi:10.1075/hl.29.3.04gra.

Hawkins, John A. 1999. Processing complexity and filler-gap dependencies across grammars. Language 75. 244–285.

Hawkins, John A. 2014. Cross-linguistic variation and efficiency. New York: Oxford University Press.

Huang, Yan. 2000. Anaphora: A cross-linguistic study. Oxford: Oxford University Press.

Langacker, Ronald W. 1969. On pronominalization and the chain of command. In David A. Reibel & Sanford A. Schane (eds.), Modern studies in English: Readings in Transformational Grammar, 160–186. Englewood Cliffs, NJ: Prentice-Hall.

Ross, John Robert. 1967. Constraints on variables in syntax. Cambridge, MA: MIT PhD dissertation. https://eric.ed.gov/?q=ross+syntax&id=ED016965.

Van Valin, Robert D., Jr. 2005. Exploring the syntax-semantics interface. Cambridge: Cambridge University Press.



Martin Haspelmath (2019, March 13). The innovative contributions of generative grammar. Diversity Linguistics Comment. https://doi.org/10.58079/nsur

One thought on “The innovative contributions of generative grammar”

1) I think you are right when you say that the factors that made Chomsky’s ideas popular existed independently. Probably linguistics would have gone in this direction sooner or later even without him. Much was already in the air. The idea of generating well-formed expressions by algorithmic rules had existed in logic for quite some time, and there were serious attempts to apply it to natural language quite early, as in categorial grammar, which was developed in the 1930s.

2) What you say about the lack of interest in the behaviour of pronouns may be a bit exaggerated. In his normative handbook *Riktig svenska* ‘Proper Swedish’ from 1939, the Swedish linguist Erik Wellander devotes almost 20 pages to the choice between reflexive and non-reflexive pronouns, with an insightful discussion of syntactic and semantic factors. He does not cite any sources, so it is not clear how much of this was his own contribution.
