The Linguistic Review
Editor-in-Chief: Harry van der Hulst
Exemplar-based models of language propose that human language production and understanding operate with a store of concrete linguistic experiences rather than with abstract linguistic rules. While exemplar-based models are well established in areas such as phonology and morphology, common wisdom has it that they are intrinsically flawed for syntax, where infinite generative capacity is needed. This article shows that this common wisdom is wrong. It starts out by reviewing an exemplar-based syntactic model known as Data-Oriented Parsing (DOP), which operates on a corpus of phrase-structure trees. While this model is productive, it is inadequate from a grammatical point of view. We therefore extend it to the more sophisticated linguistic representations proposed by Lexical-Functional Grammar theory, resulting in a model known as LFG-DOP, which does allow for meta-linguistic judgments of acceptability. We show how DOP deals with first language acquisition, suggesting a unified model of language learning and language use, and discuss a number of syntactic phenomena that DOP can explain but that challenge rule-based models. We argue that if there is anything innate in language cognition, it is not Universal Grammar but “Universal Representation”.
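The core mechanism of DOP mentioned above — operating on a corpus of phrase-structure trees by reusing their fragments — can be sketched concretely. The following is a minimal, illustrative Python sketch (not the article's implementation; function names and the tuple tree encoding are my own assumptions) of DOP-style fragment extraction: every subtree of a corpus tree, with any combination of internal nodes "cut off" as substitution sites, becomes a reusable exemplar.

```python
# Minimal sketch of DOP-style fragment extraction (illustrative only).
# A tree is encoded as a tuple: ('Label', child, child, ...); a leaf word
# is a one-element tuple like ('John',). These conventions are assumptions,
# not the representation used in the article.
from itertools import product

def fragments_at(node):
    """All fragments rooted at this node: each internal child may either be
    cut off (kept as a bare substitution site) or expanded by one of its
    own fragments."""
    label, children = node[0], node[1:]
    options = []
    for child in children:
        opts = fragments_at(child) if len(child) > 1 else [child]
        if len(child) > 1:
            # cutting here leaves the child's category as a frontier node
            opts = [(child[0],)] + opts
        options.append(opts)
    return [(label, *combo) for combo in product(*options)]

def all_fragments(tree):
    """Collect the fragments rooted at every internal node of the tree."""
    frags = []
    stack = [tree]
    while stack:
        node = stack.pop()
        if len(node) > 1:               # internal node, not a bare word
            frags.extend(fragments_at(node))
            stack.extend(node[1:])
    return frags

# For the toy tree [S [NP John] [VP sleeps]], the fragments include the
# full tree, both one-level subtrees, and skeletons such as (S NP VP).
toy = ('S', ('NP', ('John',)), ('VP', ('sleeps',)))
frags = all_fragments(toy)
```

In full DOP, these fragments are then recombined by substitution at frontier nodes, with probabilities estimated from fragment frequencies in the corpus; this sketch covers only the extraction step.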