A role for backward transitional probabilities in word segmentation?

Category

Journal Article

Authors

Perruchet, P., Desaulty, S.

Year

2008

Title

A role for backward transitional probabilities in word segmentation?

Journal / Book / Conference

Memory & Cognition

Abstract

A number of studies have shown that people exploit transitional probabilities between successive syllables to segment a stream of continuous artificial speech into words. It is often assumed that what is actually exploited are the forward transitional probabilities (given XY, the probability of X being followed by Y), even though the backward transitional probabilities (the probability of Y having been preceded by X) are equally informative about word structure in the languages used in those studies. In two experiments, we showed that participants were able to learn the words from an artificial speech stream when the only available cues were the backward transitional probabilities. Learning was as good under those conditions as when the only available cues were the forward transitional probabilities. Implications for some current models of word segmentation, particularly simple recurrent networks and PARSER, are discussed.
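The two statistics contrasted in the abstract are easy to make concrete. The following is a minimal illustrative sketch (not the authors' materials or analysis code): given a continuous syllable stream, the forward TP of a pair XY is count(XY) / count(X as a pair onset), while the backward TP is count(XY) / count(Y as a pair offset). The toy "words" (ba-bi, tu-ta, go-ku) are invented for the example.

```python
from collections import Counter

def transitional_probabilities(syllables):
    """Compute forward and backward transitional probabilities for each
    adjacent syllable pair XY in a continuous stream.

    Forward TP:  P(Y | X) = count(XY) / count(X as first element of a pair)
    Backward TP: P(X | Y) = count(XY) / count(Y as second element of a pair)
    """
    pairs = list(zip(syllables, syllables[1:]))
    pair_counts = Counter(pairs)
    first_counts = Counter(x for x, _ in pairs)    # X occurrences with a successor
    second_counts = Counter(y for _, y in pairs)   # Y occurrences with a predecessor
    forward = {p: c / first_counts[p[0]] for p, c in pair_counts.items()}
    backward = {p: c / second_counts[p[1]] for p, c in pair_counts.items()}
    return forward, backward

# Hypothetical stream built from three two-syllable "words":
# ba-bi, tu-ta, go-ku, concatenated without pauses.
stream = ["ba", "bi", "tu", "ta", "go", "ku",
          "ba", "bi", "go", "ku", "tu", "ta", "ba", "bi"]
fwd, bwd = transitional_probabilities(stream)

# Within-word pairs have high TPs in both directions;
# across-word pairs have lower TPs in both directions.
print(fwd[("ba", "bi")], bwd[("ba", "bi")])  # within-word: 1.0 1.0
print(fwd[("bi", "tu")], bwd[("bi", "tu")])  # across-word: 0.5 0.5
```

In this toy stream both statistics mark the same word boundaries, which is the point the abstract makes: the two cues are equally informative, so showing that learners can use the backward one requires a stream where only it is diagnostic.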

Issue

7

Volume

36

Pages

1299-1305
