Paul Smolensky (Johns Hopkins University Cognitive Science Department)
Grammatical Theory with Gradient Symbol Structures
How can grammatical theory contribute to, and benefit from, advances in psycholinguistics, computational linguistics, and cognitive neuroscience? I will propose a novel computational framework for grammar — its structure, use, and acquisition — providing formal characterizations of grammar at both an abstract, functional level and a neural-network level. The most recent developments in this framework involve linguistic representations in which partially active symbols occupy blends of positions within discrete structures such as trees. Application of such Gradient Symbolic Computation (GSC) to grammatical competence theory tests a general hypothesis: theoretical disputes about whether phenomenon X should be analyzed using structure A or structure B persist because the correct analysis of X is in fact a blend of the two structures A and B. Application of GSC to language production and comprehension tests whether this type of computation can successfully unify discrete, structure-based theories of competence with gradient, activation-based theories of performance. The general architecture raises important questions about current corpus-based computational-linguistic systems that appear to achieve remarkable linguistic performance using unstructured numerical vectors as internal representations (e.g., of semantics) and deep neural networks to learn those representations.
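To make the abstract's central idea concrete, the following is a minimal illustrative sketch — not the speaker's exact formalism — of how a "blend" of two discrete analyses might be represented. It uses tensor product representations (a symbol bound to a structural role via an outer product), with hypothetical symbol names ("A", "B") and role names ("left", "right"), and shows a symbol that is partially active in two competing tree positions at once:

```python
import numpy as np

# Symbols (fillers) and tree positions (roles) as orthonormal basis vectors.
# These names and dimensions are illustrative assumptions, not from the talk.
fillers = {"A": np.array([1.0, 0.0]), "B": np.array([0.0, 1.0])}
roles = {"left": np.array([1.0, 0.0]), "right": np.array([0.0, 1.0])}

def bind(symbol, role, activity=1.0):
    """Bind a symbol to a structural role, scaled by its activity level."""
    return activity * np.outer(fillers[symbol], roles[role])

# Two competing discrete analyses: symbol A in the left position vs. the right.
# A gradient blend: A is 70% active in the left role and 30% in the right.
blend = bind("A", "left", 0.7) + bind("A", "right", 0.3)

# Unbinding: contracting the blend with a role vector recovers the
# partially active filler occupying that position.
left_filler = blend @ roles["left"]    # 0.7 * fillers["A"]
right_filler = blend @ roles["right"]  # 0.3 * fillers["A"]
```

With orthonormal role vectors, unbinding is exact, so the gradient activity levels (0.7 and 0.3) are read off directly — a discrete structure is the special case where each activity is exactly 1 or 0.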