Lexicon-grammar

Lexicon-Grammar is a method and a practice of formalized description of human languages. It holds that the systematic investigation of lexical entries is currently the main challenge in the scientific study of languages. The development of Lexicon-Grammar began in the late 1960s under Maurice Gross.

Its theoretical basis is Zellig S. Harris's distributionalism, in particular the notion of transformational rule. Its notational conventions are meant to be as clear and comprehensible as possible.

The method of Lexicon-Grammar is inspired by the hard sciences. It focuses on data collection, and hence on the real use of language, from both a quantitative and an observational point of view.

Lexicon-Grammar also requires formalization. The results of the description must be sufficiently formal to be usable in natural language processing, in particular in the development of parsers.

Under its information model, the results of the description take the form of two-dimensional tables, also called matrices, which cross-tabulate lexical entries with syntactic-semantic properties. Lexicon-Grammar thus addresses "the problem of correlations between syntax and the lexicon."
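As a minimal sketch of this information model, such a matrix can be represented as a mapping from lexical entries to property judgments, with cells marked "+" or "-" depending on whether an entry accepts a given syntactic construction. The entries and property labels below are hypothetical illustrations, not taken from an actual Lexicon-Grammar table.

```python
# Illustrative sketch (assumed, not from the source): a lexicon-grammar
# table as a matrix crossing lexical entries (rows) with
# syntactic-semantic properties (columns).

# Hypothetical properties, in Harris-style construction notation.
properties = ["N0 V", "N0 V N1", "N0 V que P"]

# Hypothetical French verb entries with invented +/- judgments.
table = {
    "admirer": {"N0 V": False, "N0 V N1": True,  "N0 V que P": False},
    "penser":  {"N0 V": True,  "N0 V N1": False, "N0 V que P": True},
}

def accepts(entry, prop):
    """Return True if the entry is marked '+' for the property."""
    return table[entry][prop]

def row(entry):
    """Render one row of the matrix in the traditional +/- notation."""
    return ["+" if table[entry][p] else "-" for p in properties]

print("penser", row("penser"))  # → penser ['+', '-', '+']
```

Because each cell is a simple binary judgment, tables built this way by different researchers can be merged column by column, which is one reason the descriptions are usable by parsers.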

Experiments have shown that several researchers or teams can make their observations cumulative.

The term lexicon-grammar was first used by Annibale Elia, after Carl Vikner's earlier term grammar-lexicon.