Universal grammar

From Wikipedia, the free encyclopedia

Universal grammar (UG), in modern linguistics, is the theory of the innate biological component of the language faculty, usually credited to Noam Chomsky. The basic postulate of UG is that there are innate constraints on what the grammar of a possible human language could be. When linguistic stimuli are received in the course of language acquisition, children then adopt specific syntactic rules that conform to UG.[1] The advocates of this theory emphasize and partially rely on the poverty of the stimulus (POS) argument and the existence of some universal properties of natural human languages. However, the latter has not been firmly established.

Other linguists have opposed that notion, arguing that languages are so diverse that the postulated universality is rare.[2] The theory of universal grammar remains a subject of debate among linguists.[3]

Overview


The term "universal grammar" is a placeholder for whichever domain-specific features of linguistic competence turn out to be innate. Within generative grammar, it is generally accepted that there must be some such features, and one of the goals of generative research is to formulate and test hypotheses about which aspects those are.[4][5] In day-to-day generative research, the notion that universal grammar exists motivates analyses in terms of general principles. As much as possible, facts about particular languages are derived from these general principles rather than from language-specific stipulations.[4]

Evidence


The idea that at least some aspects are innate is motivated by poverty of the stimulus arguments.[6][7] For example, one famous poverty of the stimulus argument concerns the acquisition of yes–no questions in English. This argument starts from the observation that children only make mistakes compatible with rules targeting hierarchical structure even though the examples which they encounter could have been generated by a simpler rule that targets linear order. In other words, children seem to ignore the possibility that the question rule is as simple as "switch the order of the first two words" and immediately jump to alternatives that rearrange constituents in tree structures. This is taken as evidence that children are born knowing that grammatical rules involve hierarchical structure, even though they have to figure out what those rules are.[6][7][8]
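The contrast between the two candidate rules can be made concrete. The following toy sketch (the sentence, bracketing, and both rule implementations are invented for illustration, not drawn from the acquisition literature) applies a linear rule and a structure-sensitive rule to a declarative containing a relative clause:

```python
# Toy contrast between a linear and a hierarchical question-formation rule.
# Linear rule: front the first auxiliary in the word string.
# Hierarchical rule: front the main-clause auxiliary, using a
# hand-supplied constituent analysis of the sentence.

def linear_rule(words):
    """Move the first occurrence of the auxiliary 'is' to the front."""
    i = words.index("is")
    return [words[i]] + words[:i] + words[i + 1:]

def hierarchical_rule(subject_np, aux, predicate):
    """Front the matrix auxiliary, leaving the subject NP intact."""
    return [aux] + subject_np + predicate

declarative = "the man who is tall is happy".split()
# Hand-supplied structure: [NP the man who is tall] [Aux is] [Pred happy]
subject_np = "the man who is tall".split()

print(" ".join(linear_rule(declarative)))
# → "is the man who tall is happy"  (the error children do not make)
print(" ".join(hierarchical_rule(subject_np, "is", ["happy"])))
# → "is the man who is tall happy"  (the attested question)
```

The linear rule's output is exactly the error type that, per the argument above, children are never observed to produce, even though the rule itself is simpler to state over word strings.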

Theories of universal grammar


Within generative grammar, there are a variety of theories about what universal grammar consists of. One notable hypothesis proposed by Hagit Borer holds that the fundamental syntactic operations are universal and that all variation arises from different feature-specifications in the lexicon.[5][9] On the other hand, a strong hypothesis adopted in some variants of Optimality Theory holds that humans are born with a universal set of constraints, and that all variation arises from differences in how these constraints are ranked.[5][10] In a 2002 paper, Noam Chomsky, Marc Hauser and W. Tecumseh Fitch proposed that universal grammar consists solely of the capacity for hierarchical phrase structure.[11]

The main hypotheses


In an article entitled "The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?"[12] Hauser, Chomsky, and Fitch present the three leading hypotheses for how language evolved and brought humans to the point where they have a universal grammar.

The first hypothesis states that the faculty of language in the broad sense (FLb) is strictly homologous to animal communication. This means that homologous aspects of the faculty of language exist in non-human animals.

The second hypothesis states that the FLb is a derived and uniquely human adaptation for language. This hypothesis holds that individual traits were subject to natural selection and came to be specialized for humans.

The third hypothesis states that only the faculty of language in the narrow sense (FLn) is unique to humans. It holds that while mechanisms of the FLb are present in both human and non-human animals, the computational mechanism of recursion has evolved recently, and solely in humans.[13]

Presence of creole languages


The presence of creole languages is sometimes cited as further support for this theory, especially in Bickerton's language bioprogram theory. Creole languages develop when disparate societies with no common language come together and are forced to devise a new system of communication. The system used by the original speakers is typically an inconsistent mix of vocabulary items, known as a pidgin. As these speakers' children begin to acquire their first language, they use the pidgin input to effectively create their own original language, known as a creole language. Unlike pidgins, creole languages have native speakers (those who acquired the language from early childhood) and make use of a full, systematic grammar.

Bickerton claims the fact that certain features are shared by virtually all creole languages supports the notion of a universal grammar. For example, their default point of reference in time (expressed by bare verb stems) is not the present moment, but the past. Using pre-verbal auxiliaries, they uniformly express tense, aspect, and mood. Negative concord occurs, but it affects the verbal subject (as opposed to the object, as it does in languages like Spanish). Another similarity among creole languages can be identified in the fact that questions are created simply by changing the intonation of a declarative sentence; not its word order or content.

Opposing this notion, the work by Carla Hudson-Kam and Elissa Newport suggests that creole languages may not support a universal grammar at all. In a series of experiments, Hudson-Kam and Newport looked at how children and adults learn artificial grammars. They found that children tend to ignore minor variations in the input when those variations are infrequent, and reproduce only the most frequent forms. In doing so, the children tend to standardize the language they hear around them. Hudson-Kam and Newport hypothesize that in a pidgin-development situation (and in the real-life situation of a deaf child whose parents are or were disfluent signers), children systematize the language they hear, based on the probability and frequency of forms, and not that which has been suggested on the basis of a universal grammar.[14][15] Further, they argue, it seems to follow that creole languages would share features with the languages from which they are derived, and thus look similar in terms of grammar.
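The regularization effect Hudson-Kam and Newport describe can be sketched as a toy simulation (the 70/30 frequencies and both learner models below are invented for illustration):

```python
import random
from collections import Counter

random.seed(0)
# Invented input: an inconsistent, pidgin-like source uses form A on
# 70% of occasions and form B on 30%.
input_forms = ["A"] * 70 + ["B"] * 30

def probability_matcher(forms, n):
    """Adult-like learner: reproduces forms at roughly their input rates."""
    return [random.choice(forms) for _ in range(n)]

def regularizer(forms, n):
    """Child-like learner: settles on the single most frequent form."""
    winner = Counter(forms).most_common(1)[0][0]
    return [winner] * n

print(Counter(probability_matcher(input_forms, 100)))  # a mix, roughly 70/30
print(Counter(regularizer(input_forms, 100)))          # only "A": variation gone
```

On this view, the systematic grammar of a creole reflects frequency-driven standardization of inconsistent input rather than the unfolding of an innate grammar.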

Many researchers of universal grammar argue against the concept of relexification, i.e. the idea that a language replaces its lexicon almost entirely with that of another. This, they argue, conflicts with the notion of an innate universal grammar.[citation needed]

Views and assessments


Recent research has used recurrent neural network (RNN) architectures. McCoy, Frank, and Linzen (2018) focused on a strong version of the poverty-of-the-stimulus argument, which claims that language learners require an innate hierarchical constraint. They note that a milder version, which asserts only that a hierarchical bias is necessary, is difficult to assess using RNNs, because RNNs must possess some biases and the nature of these biases remains "currently poorly understood." They report that all the architectures they used had a bias toward linear order, and that the GRU-with-attention architecture was the only one that overcame this linear bias sufficiently to generalize hierarchically; regarding the strong version, they allow that humans "certainly could have such an innate constraint."[16]

The empirical basis of poverty-of-the-stimulus arguments has been challenged by Geoffrey Pullum and others, leading to a persistent back-and-forth debate in the language acquisition literature.[17][18]

Language acquisition researcher Michael Ramscar has suggested that when children erroneously expect an ungrammatical form that then never occurs, the repeated failure of that expectation serves as a form of implicit negative feedback that allows them to correct their errors over time; in this way, for example, children correct overgeneralizations like goed to went.[19][20]

In addition, it has been suggested that people learn about probabilistic patterns of word distribution in their language, rather than hard and fast rules (see Distributional hypothesis).[21] For example, in English, children overgeneralize the past tense marker "-ed" and conjugate irregular verbs as if they were regular, producing forms like goed and eated, and then correct this deviancy over time.[19] It has also been hypothesized that the poverty of the stimulus problem can be largely avoided if it is assumed that children employ similarity-based generalization strategies in language learning, i.e. generalizing about the usage of new words from similar words they already know how to use.[22]

Neurogeneticists Simon Fisher and Sonja Vernes observe that, with human language skills evidently unmatched elsewhere in the world's fauna, there have been several theories positing a single mutation event at some point in the past of our nonspeaking ancestors, as argued by e.g. Chomsky (2011): some "lone spark that was sufficient to trigger the sudden appearance of language and culture." They characterize that notion as "romantic" and "inconsistent with the messy mappings between genetics and cognitive processes." According to Fisher and Vernes, the link between genes and grammar has not been consistently mapped by scientists; what research has established, they claim, relates primarily to speech pathologies. The resulting lack of certainty, they conclude, has provided an audience for "unconstrained speculations" that have fed the "myth" of "so-called grammar genes".[23]

Professor of Natural Language Computing Geoffrey Sampson maintains that universal grammar theories are not falsifiable and are therefore pseudoscientific. He argues that the grammatical "rules" linguists posit are simply post-hoc observations about existing languages, rather than predictions about what is possible in a language.[24][25] Similarly, Jeffrey Elman argues that the unlearnability of languages ostensibly assumed by universal grammar is based on a too-strict, "worst-case" model of grammar, which is not in keeping with any actual grammar. James Hurford argues that the postulate of a language acquisition device (LAD) essentially amounts to the trivial statement that languages are learnt by humans, and thus, that the LAD is less a theory than an explanandum looking for a theory.[26]

Morten H. Christiansen and Nick Chater have argued that the relatively fast-changing nature of language would prevent the slower-changing genetic structures from ever catching up, undermining the possibility of a genetically hard-wired universal grammar. Instead of an innate universal grammar, they claim, "apparently arbitrary aspects of linguistic structure may result from general learning and processing biases deriving from the structure of thought processes, perceptuo-motor factors, cognitive limitations, and pragmatics".[27]

Wolfram Hinzen, in his work The philosophical significance of Universal Grammar,[28] seeks to re-establish the epistemological significance of grammar and addresses the three main current objections to Cartesian universal grammar: that it has no coherent formulation, that it cannot have evolved by standard, accepted neo-Darwinian evolutionary principles, and that it goes against the variation extant at all levels of linguistic organization, which lies at the heart of the human faculty of language.

In the domain of field research, Daniel Everett has claimed that the Pirahã language is a counterexample to the basic tenets of universal grammar because it lacks clausal embedding. According to Everett, this trait results from Pirahã culture's emphasis on present-moment, concrete matters.[29] Other linguists have responded that Pirahã does in fact have clausal embedding, and that, even if it did not, this would be irrelevant to current theories of universal grammar. Nevins et al. (2007) argued against each of Everett's claims and, using Everett's "rich material" data, found no evidence of a causal relation between culture and grammatical structure. Pirahã grammar, they concluded, presents no unusual challenge, much less the "severe" one claimed by Everett, to the notion of a universal grammar.[30]

Developments


The modern conception of universal grammar is generally attributed to Noam Chomsky, yet similar ideas are found in older work. A related idea is found in Roger Bacon's c. 1245 Overview of Grammar and c. 1268 Greek Grammar, where he postulates that all languages are built upon a common grammar, even though it may undergo incidental variations. In the 13th century, the speculative grammarians postulated universal rules underlying all grammars.[citation needed]

The concept of a universal grammar or language was at the core of the 17th century projects for philosophical languages. An influential work in that time was Grammaire générale by Claude Lancelot and Antoine Arnauld. They describe a general grammar for languages, coming to the conclusion that grammar has to be universal.[31] There is a Scottish school of universal grammarians from the 18th century, as distinguished from the philosophical language project, which included authors such as James Beattie, Hugh Blair, James Burnett, James Harris, and Adam Smith.

The article on grammar in the first edition of the Encyclopædia Britannica (1771) contains an extensive section titled "Of Universal Grammar", under the lemma "Grammar".[32]

In the late 19th and early 20th centuries, Wilhelm Wundt and Otto Jespersen claimed that these earlier arguments were overly influenced by Latin and ignored the breadth of worldwide language variation. Jespersen did not discard the idea of a "universal grammar", but reduced it to universal syntactic categories or super-categories, such as number, tenses, etc.[33]

Behaviorists, after the rise of the eponymous theory, advanced the idea that language acquisition, like any other kind of learning, could be explained by a succession of trials, errors, and rewards for success.[34] In other words, children learn their mother tongue by simple imitation, through listening and repeating what adults say. For example, when a child says "milk", the mother smiles and gives milk to the child; the child finds this outcome rewarding, which enhances the child's language development.[35]

In 2017, Chomsky and Berwick co-wrote Why Only Us, in which they defined both the minimalist program and the strong minimalist thesis and its implications, updating their approach to UG theory. According to Berwick and Chomsky, "the optimal situation would be that UG reduces to the simplest computational principles which operate in accord with conditions of computational efficiency. This conjecture is ... called the Strong Minimalist Thesis (SMT)."[36]: 94 

The significance of SMT is to shift the previous emphasis on a universal grammar to the concept that Chomsky and Berwick now call "merge". "Merge" is defined there as follows:

Every computational system has embedded within it somewhere an operation that applies to two objects X and Y already formed, and constructs from them a new object Z. Call this operation Merge.

SMT dictates that "Merge will be as simple as possible: it will not modify X or Y or impose any arrangement on them; in particular, it will leave them unordered; an important fact. Merge is therefore just set formation: Merge of X and Y yields the set {X, Y}."[36]: 98 
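Berwick and Chomsky's definition of Merge as bare set formation can be written out directly. In this minimal sketch (the lexical items are placeholders chosen for illustration), frozenset stands in for the unordered set {X, Y}, so that the output of one Merge can itself be merged:

```python
# Merge as set formation: Merge(X, Y) = {X, Y}. The operation does not
# modify X or Y and imposes no order on them.

def merge(x, y):
    return frozenset({x, y})

# Repeated application yields hierarchical (nested) structure:
inner = merge("ate", "apples")   # {ate, apples}
outer = merge("John", inner)     # {John, {ate, apples}}

# Unordered: merging in either order gives the same object.
assert merge("ate", "apples") == merge("apples", "ate")
print(outer)
```

Because Merge yields a set rather than a sequence, the nested object carries hierarchy but no linear order, matching the passage's claim that Merge "is therefore just set formation."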


Notes

  1. ^ Chomsky, Noam. "Tool Module: Chomsky's Universal Grammar". Retrieved 2010-10-07.
  2. ^ Evans, Nicholas; Levinson, Stephen C. (26 October 2009). "The myth of language universals: Language diversity and its importance for cognitive science". Behavioral and Brain Sciences. 32 (5): 429–48. doi:10.1017/S0140525X0999094X. hdl:11858/00-001M-0000-0012-C29E-4. PMID 19857320. S2CID 2675474. Archived (PDF) from the original on 27 July 2018.
  3. ^ Christensen, Christian Hejlesen (March 2019). "Arguments for and against the Idea of Universal Grammar". Leviathan (4): 12–28. doi:10.7146/lev.v0i4.112677. S2CID 172055557. Retrieved 1 May 2025.
  4. ^ a b Wasow, Thomas (2003). "Generative Grammar" (PDF). In Aronoff, Mark; Ress-Miller, Janie (eds.). The Handbook of Linguistics. Blackwell. p. 299. doi:10.1002/9780470756409.ch12. ISBN 978-0-631-20497-8.
  5. ^ a b c Pesetsky, David (1999). "Linguistic universals and universal grammar". In Wilson, Robert; Keil, Frank (eds.). The MIT encyclopedia of the cognitive sciences. MIT Press. pp. 476–478. doi:10.7551/mitpress/4660.001.0001. ISBN 978-0-262-33816-5.
  6. ^ a b Adger, David (2003). Core syntax: A minimalist approach. Oxford University Press. pp. 8–11. ISBN 978-0199243709.
  7. ^ a b Lasnik, Howard; Lidz, Jeffrey (2017). "The Argument from the Poverty of the Stimulus" (PDF). In Roberts, Ian (ed.). The Oxford Handbook of Universal Grammar. Oxford University Press.
  8. ^ Crain, Stephen; Nakayama, Mineharu (1987). "Structure dependence in grammar formation". Language. 63 (3): 522–543. doi:10.2307/415004. JSTOR 415004.
  9. ^ Gallego, Ángel (2012). "Parameters". In Boeckx, Cedric (ed.). The Oxford Handbook of Linguistic Minimalism. Oxford University Press. doi:10.1093/oxfordhb/9780199549368.013.0023.
  10. ^ McCarthy, John (1992). Doing optimality theory. Wiley. pp. 1–3. ISBN 978-1-4051-5136-8.
  11. ^ Hauser, Marc; Chomsky, Noam; Fitch, W. Tecumseh (2002). "The faculty of language: what is it, who has it, and how did it evolve". Science. 298 (5598): 1569–1579. doi:10.1126/science.298.5598.1569. PMID 12446899.
  12. ^ Hauser, Marc; Chomsky, Noam; Fitch, William Tecumseh (22 November 2002), "The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?" (PDF), Science, 298 (5598): 1569–1579, doi:10.1126/science.298.5598.1569, PMID 12446899, archived from the original (PDF) on 28 December 2013, retrieved 28 December 2013
  13. ^ Hauser, Marc; Chomsky, Noam; Fitch, William Tecumseh (22 November 2002), "The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?" (PDF), Science, 298 (5598): 1569–1579, doi:10.1126/science.298.5598.1569, PMID 12446899, archived from the original (PDF) on 28 December 2013, retrieved 11 April 2024, We hypothesize that FLN only includes recursion and is the only uniquely human component of the faculty of language. [...] the core recursive aspect of FLN currently appears to lack any analog in animal communication and possibly other domains as well.
  14. ^ Hudson Kam, C. L.; Newport, E. L. (2009). "Getting it right by getting it wrong: When learners change languages". Cognitive Psychology. 59 (1): 30–66. doi:10.1016/j.cogpsych.2009.01.001. PMC 2703698. PMID 19324332.
  15. ^ Dye, Melody (February 9, 2010). "The Advantages of Being Helpless". Scientific American. Retrieved June 10, 2014.
  16. ^ McCoy, R. Thomas; Frank, Robert; Linzen, Tal (2018). "Revisiting the poverty of the stimulus: hierarchical generalization without a hierarchical bias in recurrent neural networks" (PDF). Proceedings of the 40th Annual Conference of the Cognitive Science Society: 2093–2098. arXiv:1802.09091.
  17. ^ Pullum, Geoff; Scholz, Barbara (2002). "Empirical assessment of stimulus poverty arguments". The Linguistic Review. 19 (1–2): 9–50. doi:10.1515/tlir.19.1-2.9.
  18. ^ Legate, Julie Anne; Yang, Charles (2002). "Empirical re-assessment of stimulus poverty arguments" (PDF). The Linguistic Review. 19 (1–2): 151–162. doi:10.1515/tlir.19.1-2.151.
  19. ^ a b Fernández, Eva M.; Helen Smith Cairns (2011). Fundamentals of Psycholinguistics. Chichester, West Sussex, England: Wiley-Blackwell. ISBN 978-1-4051-9147-0.
  20. ^ Ramscar, Michael; Yarlett, Daniel (2007). "Linguistic self-correction in the absence of feedback: A new approach to the logical problem of language acquisition". Cognitive Science. 31 (6): 927–960. CiteSeerX 10.1.1.501.4207. doi:10.1080/03640210701703576. PMID 21635323. S2CID 2277787.
  21. ^ McDonald, Scott; Ramscar, Michael (2001). "Testing the distributional hypothesis: The influence of context on judgements of semantic similarity". Proceedings of the 23rd Annual Conference of the Cognitive Science Society: 611–616. CiteSeerX 10.1.1.104.7535.
  22. ^ Yarlett, Daniel G.; Ramscar, Michael J. A. (2008). "Language Learning Through Similarity-Based Generalization" (PDF). draft. Stanford University. CiteSeerX 10.1.1.393.7298.
  23. ^ Fisher, Simon E.; Vernes, Sonja C. (January 2015). "Genetics and the Language Sciences". Annual Review of Linguistics. 1: 289–310. doi:10.1146/annurev-linguist-030514-125024. hdl:11858/00-001M-0000-0019-DA19-1. Retrieved 1 May 2025.
  24. ^ Sampson, Geoffrey (2005). The 'Language Instinct' Debate: Revised Edition. Bloomsbury Academic. ISBN 978-0-8264-7385-1.
  25. ^ Cipriani, Enrico (2015). "The generative grammar between philosophy and science". European Journal of Literature and Linguistics. 4: 12–16.
  26. ^ Hurford, James R. (1995). "Nativist and Functional Explanations in Language Acquisition" (PDF). In I. M. Roca (ed.). Logical Issues in Language Acquisition. Dordrecht, Holland and Providence, Rhode Island: Foris Publications. p. 88. Archived (PDF) from the original on 2022-10-09. Retrieved June 10, 2014.
  27. ^ Christiansen, Morten H.; Chater, Nick (October 2008). "Language as Shaped by the Brain". Behavioral and Brain Sciences. 31 (5): 489–508. doi:10.1017/S0140525X08004998.
  28. ^ Hinzen, Wolfram (September 2012). "The philosophical significance of Universal Grammar". Language Sciences. 34 (5): 635–649. doi:10.1016/j.langsci.2012.03.005.
  29. ^ Everett, Daniel L. (August–October 2005). "Cultural Constraints on Grammar and Cognition in Pirahã: Another Look at the Design Features of Human Language" (PDF). Current Anthropology. 46 (4): 621–646. doi:10.1086/431525. hdl:2066/41103. S2CID 2223235. Archived (PDF) from the original on 2022-10-09.
  30. ^ Nevins, Andrew; Pesetsky, David; Rodrigues, Cilene (March 8, 2007). "Pirahã Exceptionality: a Reassessment" (PDF). International Cognition & Culture Institute. Retrieved May 1, 2025.
  31. ^ Lancelot, Claude (1967) [1660]. Grammaire generale et raisonnee. Scolar Press. OCLC 367432981.
  32. ^ "Of Universal Grammar". Encyclopædia Britannica. 2 (1st ed.). National Library of Scotland: 728–9. 1771. Retrieved 1 May 2025.
  33. ^ Jespersen 1965, p. 53.
  34. ^ Chomsky, Noam. "Tool Module: Chomsky's Universal Grammar". Retrieved 2010-10-07.
  35. ^ Ambridge & Lieven, 2011.
  36. ^ a b Chomsky, Noam; Berwick, Robert C. (12 May 2017). Why Only Us?. MIT Press. ISBN 9780262533492.

References

  • Ambridge, Ben; Lieven, Elena V. M. (2011-03-17). Child Language Acquisition. Cambridge University Press. ISBN 978-0-521-76804-7.
  • Baker, Mark C. The Atoms of Language: The Mind's Hidden Rules of Grammar. Oxford University Press, 2003. ISBN 0-19-860632-X.
  • Beattie, James. "Of Universal Grammar". Section II, The Theory of Language (1788). Rpt in Dissertations Moral and Critical (1783, 1986.)
  • Blair, Hugh. Lecture 6, 7, and 8, Lectures on Rhetoric and Belles Lettres, (1783). Rpt New York: Garland, 1970.
  • Burnett, James. Of the Origin and Progress of Language. Edinburgh, 1774–1792.
  • Chomsky, Noam (2007), "Approaching UG from Below", Interfaces + Recursion = Language?, DE GRUYTER, pp. 1–30, doi:10.1515/9783110207552-001, ISBN 9783110207552
  • Chomsky, N. Aspects of the Theory of Syntax. MIT Press, 1965. ISBN 0-262-53007-4.
  • Chomsky, Noam (2017), "The Galilean Challenge: Architecture and Evolution of Language", Journal of Physics: Conference Series, 880 (1): 012015, Bibcode:2017JPhCS.880a2015C, doi:10.1088/1742-6596/880/1/012015, ISSN 1742-6588
  • Elman, J., Bates, E. et al. Rethinking innateness. MIT Press, 1996.
  • Harris, James. Hermes or A Philosophical Inquiry Concerning Universal Grammar. (1751, 1771.)
  • Jespersen, Otto (1965) [1924], The Philosophy of Grammar, Norton
  • Kliesch, C. (2012). Making sense of syntax – Innate or acquired? Contrasting universal grammar with other approaches to language acquisition. Journal of European Psychology Students, 3, 88–94.
  • Lancelot, Claude; Arnauld, Antoine (1968) [1660], Grammaire générale et raisonnée contenant les fondemens de l'art de parler, expliqués d'une manière claire et naturelle, Slatkine Reprints
  • "Of Universal Grammar". In "Grammar". Encyclopædia Britannica, (1771).
  • Pesetsky, David. "Linguistic Universals and Universal Grammar". In The MIT Encyclopedia of the Cognitive Sciences. Ed. Robert A. Wilson and Frank C. Keil Cambridge, MA: MIT Press 1999.
  • Sampson, G. The "Language Instinct" Debate. Continuum International Publishing Group, 2005. ISBN 0-8264-7384-9.
  • Smith, Adam. "Considerations Concerning the First Formation of Languages". In Lectures on Rhetoric and Belles Lettres. Ed. J. C. Bryce. Indianapolis: Liberty Press, 1983, 203–226.
  • Smith, Adam. "Of the Origin and Progress of Language". Lecture 3, Lectures on Rhetoric and Belles Lettres. Ed. J. C. Bryce. Indianapolis: Liberty Press, 1983, 9–13.
  • Tomasello, M. Constructing a Language: A Usage-Based Theory of Language Acquisition. Harvard University Press, 2003. ISBN 0-674-01030-2.
  • Valian, Virginia (1986), "Syntactic Categories in the Speech of Young Children", Developmental Psychology, 22 (4): 562–579, doi:10.1037/0012-1649.22.4.562
  • Window on Humanity. A Concise Introduction to Anthropology. Conrad Phillip Kottak. Ed. Kevin Witt, Jill Gordon. The McGraw-Hill Companies, Inc. 2005.
  • White, Lydia. "Second Language Acquisition and Universal Grammar". Cambridge University Press, 2003. ISBN 0-521-79647-4
  • Zuidema, Willem. How the poverty of stimulus solves the poverty of stimulus. "Evolution of Language: Fourth International Conference", Harvard University, March 2002.
