A learnability argument for constraints on underlying representations
Ezer Rasin, Roni Katzir
September 2017
 

Speakers judge some nonce forms as nonexistent but possible – that is, as accidental gaps – and other nonce forms as nonexistent and impossible – that is, as systematic gaps. Early generative approaches accounted for systematic gaps through a combination of two factors: constraints on underlying representations (URs) in the lexicon, and phonological rules. Contrasting with this view, Optimality Theory (OT) has been guided by the idea that phonological generalizations are captured not in the lexicon but rather on the surface or in the mapping from URs to surface forms. The view that there are no constraints on URs is often referred to as Richness of the Base (ROTB), and it is a central tenet of OT. Our goal in this note is to re-open the question of whether OT requires constraints on URs and to offer a learnability argument supporting an affirmative answer, thus arguing against ROTB. We start by examining the extant literature on learning in OT and argue that the learners proposed there overgeneralize (by treating some systematic gaps as accidental), undergeneralize (by treating some accidental gaps as systematic), or both. We then discuss a different approach to learning, compression-based learning, which is the only approach currently available that can in principle handle the data without over- or undergeneralization. We show that compression-based learning learns certain naturally occurring patterns, but crucially only if it rejects ROTB and employs language-specific constraints on URs.
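As a rough illustration of the compression-based (minimum description length) criterion mentioned above, the following sketch compares two hypothetical grammars by their total description length: bits to state the grammar plus bits to encode the observed data given the grammar. The corpus, form inventory, and bit costs are all invented for illustration and do not come from the paper.

```python
import math

def description_length(grammar_bits, data, prob):
    """MDL score: grammar size in bits plus the cost of encoding
    the data under the grammar (negative log2 probability)."""
    data_bits = sum(-math.log2(prob(form)) for form in data)
    return grammar_bits + data_bits

# Hypothetical corpus: only 8 of 16 logically possible forms ever occur,
# i.e., half the form space is a systematic gap.
corpus = ["ta", "te", "ka", "ke"] * 5

# Hypothesis A: no constraint on URs -- all 16 forms equally likely.
unrestricted = lambda form: 1 / 16
# Hypothesis B: a constraint on URs rules out half the forms, costing
# (say) 10 extra bits to state, leaving 8 equally likely forms.
restricted = lambda form: 1 / 8

score_a = description_length(5, corpus, unrestricted)   # 5 + 20*4 = 85 bits
score_b = description_length(15, corpus, restricted)    # 15 + 20*3 = 75 bits

# The restricted grammar pays for the constraint once but saves one bit
# per observed form, so it wins once the corpus is large enough.
assert score_b < score_a
```

On this kind of criterion, a learner prefers a grammar with a constraint on URs exactly when the constraint's one-time encoding cost is outweighed by the per-form savings it yields on the observed data.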
Format: [ pdf ]
Reference: lingbuzz/002260
(please use that when you cite this article)
Published in: Submitted
keywords: learning, evaluation metrics, minimum description length, optimality theory, morpheme structure constraints, richness of the base, phonology
previous versions: v1 [October 2014]

 
