Automatic pitch contour stylization using a model of tonal perception


Bibliographic Details
Published in: Computer Speech & Language, Vol. 9, No. 3, pp. 257-288
Main Authors: d'Alessandro, Christophe; Mertens, Piet
Format: Journal Article
Language: English
Published: Oxford: Elsevier Ltd, 01.07.1995

Summary: A new quantitative model of tonal perception for continuous speech is described. The paper illustrates its use for automatic stylization of pitch contours, with applications to prosodic analysis and speech synthesis in mind, and evaluates it in a perception experiment. After a discussion of the psycho-acoustics of tonal perception and an overview of existing tonal perception models and systems for automatic analysis of intonation, the model and its computer implementation are described in detail. The model comprises parameter extraction, segmentation into syllables, perceptual integration of short-term pitch change, tonal segment computation, and pitch contour stylization. This is followed by a perception experiment in which subjects are asked to distinguish original signals from resynthesized signals with automatically stylized pitch contours. The aim of this experiment is to show the usefulness of the model as a basis for intonation representation and to study the influence of the model parameters. The stylization obtained with the model is shown to be an economical representation of intonation that can be useful for speech synthesis and prosodic analysis.
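
The summary lists the model's processing stages: parameter extraction, segmentation into syllables, perceptual integration of short-term pitch change, tonal segment computation, and pitch contour stylization. As a rough illustration of the stylization step only, the minimal Python sketch below reduces each syllabic nucleus to either a static tone or a single linear glide, depending on whether its rate of pitch change exceeds a glissando threshold. The data structure, the function names, and the threshold form G = coef / T^2 ST/s (with coef = 0.32) are illustrative assumptions drawn from the general tonal-perception literature, not the parameters or exact algorithm of the paper.

import math
from dataclasses import dataclass

@dataclass
class SyllableNucleus:
    # One syllabic nucleus with its boundary pitch values (hypothetical structure).
    start: float          # start time in seconds
    end: float            # end time in seconds
    f0_start_hz: float    # F0 at nucleus onset, in Hz
    f0_end_hz: float      # F0 at nucleus offset, in Hz

def hz_to_semitones(f_hz: float, ref_hz: float = 1.0) -> float:
    # Convert a frequency to semitones relative to a reference frequency.
    return 12.0 * math.log2(f_hz / ref_hz)

def stylize_nucleus(nuc: SyllableNucleus, glissando_coef: float = 0.32):
    # Reduce a nucleus to a static tone or a linear glide.
    # Threshold G = coef / T**2 (ST/s) is an assumed, commonly cited form;
    # the paper's actual perceptual-integration step may differ.
    duration = nuc.end - nuc.start
    start_st = hz_to_semitones(nuc.f0_start_hz)
    end_st = hz_to_semitones(nuc.f0_end_hz)
    rate = abs(end_st - start_st) / duration        # pitch change rate, ST/s
    threshold = glissando_coef / (duration ** 2)    # glissando threshold, ST/s
    if rate < threshold:
        # Change too slow to be perceived as a glide: represent as a level tone.
        mean_st = (start_st + end_st) / 2.0
        return ("static", mean_st, mean_st)
    # Perceptible glide: keep a single linear movement between onset and offset.
    return ("glide", start_st, end_st)

# Example: a 200 ms nucleus rising from 120 Hz to 150 Hz is kept as a glide.
print(stylize_nucleus(SyllableNucleus(0.0, 0.2, 120.0, 150.0)))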
ISSN: 0885-2308, 1095-8363
DOI: 10.1006/csla.1995.0013