Statistical language models within the algebra of weighted rational languages

Bibliographic Details
Main Authors: Hanneforth, Thomas; Würzner, Kay-Michael
Corporate Author: Weighted Automata: Theory and Applications (2008, Dresden)
Format: Article
Published: 2009
Series: Acta Cybernetica 19, No. 2
Keywords: Computer Science, Cybernetics
Online Access: http://acta.bibl.u-szeged.hu/12868
Description
Summary: Statistical language models are an important tool in natural language processing. They represent prior knowledge about a given language, usually gained from a set of samples called a corpus. In this paper, we present a novel way of creating N-gram language models using weighted finite automata. The construction of these models is formalised within the algebra underlying weighted finite automata and expressed in terms of weighted rational languages and transductions. Besides the algebra, we make use of five special constant weighted transductions which rely only on the alphabet and the model parameter N. In addition, we discuss efficient implementations of these transductions in terms of virtual constructions.
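
For orientation, the following is a minimal, self-contained sketch of the kind of N-gram model the abstract refers to: a bigram model whose maximum-likelihood estimates are stored as negative log probabilities, i.e. weights in the tropical semiring commonly used with weighted finite automata. It is not the algebraic construction developed in the paper; the function name bigram_model and the sentence markers <s>/</s> are illustrative assumptions.

    import math
    from collections import defaultdict

    def bigram_model(corpus, bos="<s>", eos="</s>"):
        """Estimate a bigram model from a list of tokenised sentences.

        Returns arcs of a deterministic weighted automaton: the state is the
        previous token, the arc label is the current token, and the weight is
        -log P(cur | prev) under maximum-likelihood estimation.
        """
        counts = defaultdict(lambda: defaultdict(int))
        for sentence in corpus:
            tokens = [bos] + sentence + [eos]
            for prev, cur in zip(tokens, tokens[1:]):
                counts[prev][cur] += 1

        arcs = {}
        for prev, followers in counts.items():
            total = sum(followers.values())
            for cur, c in followers.items():
                arcs[(prev, cur)] = -math.log(c / total)
        return arcs

    if __name__ == "__main__":
        corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
        for (prev, cur), weight in sorted(bigram_model(corpus).items()):
            print(f"{prev} -> {cur}: {weight:.4f}")

The paper instead expresses such models purely within the algebra of weighted rational languages and transductions, without materialising the counts explicitly; the sketch above only illustrates the weights such a model assigns.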
Physical Description: 313-356
ISSN: 0324-721X