We present a prototype model, based on a combination of count-based distributional semantics and prediction-based neural word embeddings, which learns about syntactic categories as a function of (1) writing contextual, phonological, and lexical-stress-related information to memory and (2) predicting upcoming context words based on the memorized information. The system is a first step toward applying recently popular methods from Natural Language Processing to explore the role of prediction in children's acquisition of syntactic categories.
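The two mechanisms the abstract names can be illustrated with a minimal sketch: a count-based co-occurrence "memory" plus a skip-gram-style predictor of the upcoming word, trained with plain softmax and SGD. The toy corpus, hyperparameters (dim, lr, epochs), and overall setup are illustrative assumptions, not the paper's actual model or data.

```python
import math
import random

# Toy corpus stands in for child-directed speech (an assumption).
corpus = "the cat sat on the mat the dog sat on the rug".split()

vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V = len(vocab)
dim = 8
random.seed(0)

# (1) "Memory": count-based distributional vectors from neighboring words.
counts = [[0.0] * V for _ in range(V)]
for i, w in enumerate(corpus):
    for j in (i - 1, i + 1):
        if 0 <= j < len(corpus):
            counts[idx[w]][idx[corpus[j]]] += 1.0

# (2) Prediction: embeddings trained to predict the upcoming word
# (skip-gram-style, full softmax; hyperparameters are illustrative).
emb = [[random.uniform(-0.5, 0.5) for _ in range(dim)] for _ in range(V)]
out = [[random.uniform(-0.5, 0.5) for _ in range(dim)] for _ in range(V)]

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

lr = 0.1
for epoch in range(200):
    for i in range(len(corpus) - 1):
        w, c = idx[corpus[i]], idx[corpus[i + 1]]
        scores = [sum(emb[w][k] * out[o][k] for k in range(dim))
                  for o in range(V)]
        probs = softmax(scores)
        grad_emb = [0.0] * dim
        for o in range(V):
            err = probs[o] - (1.0 if o == c else 0.0)
            for k in range(dim):
                grad_emb[k] += err * out[o][k]  # accumulate before updating
                out[o][k] -= lr * err * emb[w][k]
        for k in range(dim):
            emb[w][k] -= lr * grad_emb[k]

def cos(a, b):
    num = sum(x * y for x, y in zip(a, b))
    den = (math.sqrt(sum(x * x for x in a)) *
           math.sqrt(sum(y * y for y in b)))
    return num / den

# Words filling the same syntactic slot ("cat"/"dog") should end up
# with more similar embeddings than words from different slots.
print(cos(emb[idx["cat"]], emb[idx["dog"]]))
print(cos(emb[idx["cat"]], emb[idx["on"]]))
```

In this sketch the syntactic-category signal emerges because words in the same slot are trained toward the same prediction targets; the paper's model additionally memorizes phonological and lexical-stress information, which is omitted here.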
Title of host publication: Proceedings of the 6th Workshop on Cognitive Aspects of Computational Language Learning
Publisher: Association for Computational Linguistics (ACL)
Number of pages: 5
Publication status: Published - Sep 2015