This article argues that language cannot be a recursive-embedding system in the sense of Chomsky (1965 et seq.) but must simply be a discrete combinatorial system, basically in the sense of dependency grammar. It argues that the recursive-embedding model is a misconception that has had severe consequences for the explanatory value of generative grammar, especially during the last fifteen years, leaving the theory with essentially only one syntactic relation (that between a head and its complement, including everything that the complement contains). Crucially, it is shown that the recursive-embedding model in its present form, working from the bottom up and, as in the case of English, from right to left, cannot handle discrete infinity, as it entails that the derivation does not have a beginning (it is the starting point that is pushed back into infinity, not the end result). Moreover, it cannot manage external arguments. Furthermore, it is pointed out that the model is not compatible with the way in which sentence production and processing work. Instead, the paper offers a simpler model, namely that lexical items are atomic units that combine, rather than merge, to form syntactic structures similar to those of chemical compounds, more or less in the same fashion as in Hudson's (2007, 2010) Word Grammar. One of the many advantages of the discrete combinatorial model is that it permits discrete infinity; another is that it allows for the selection of the external argument (the latter declared impossible in Chomsky 2004). Moreover, it offers a minimalist account of the mechanisms behind head raising and affixation, without having to assume any head-movement operation (declared impossible in Chomsky 2001). Lastly, it is demonstrated that the discrete combinatorial model is much more local and economical and that it, like sentence production and comprehension, can derive structures from left to right and, as in the case of English, from the top down.
©2014 by Walter de Gruyter Berlin/Boston