Abstract
Several mathematical distances between probabilistic languages have been investigated in the literature, motivated by applications in language modeling, computational biology, syntactic pattern matching and machine learning. In most cases, only pairs of probabilistic regular languages were considered. In this paper we extend the previous results to pairs of languages generated by a probabilistic context-free grammar and a probabilistic finite automaton.
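One of the distances studied in this line of work is the Kullback-Leibler divergence listed in the keywords below. As a minimal illustrative sketch (not the paper's algorithm, which handles infinite languages generated by grammars and automata), the divergence between two finite distributions over strings can be computed as:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) between two probability
    distributions given as dicts mapping strings to probabilities.
    Assumes q[w] > 0 wherever p[w] > 0 (absolute continuity)."""
    return sum(pw * math.log(pw / q[w]) for w, pw in p.items() if pw > 0)

# Two toy distributions over the same three strings:
P = {"a": 0.5, "ab": 0.3, "abb": 0.2}
Q = {"a": 0.4, "ab": 0.4, "abb": 0.2}

print(kl_divergence(P, P))  # 0.0 -- zero divergence from itself
print(kl_divergence(P, Q))  # small positive value
```

For the probabilistic context-free and regular languages treated in the paper, the sum ranges over infinitely many strings, so closed-form or iterative techniques replace this direct enumeration.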
Field | Value
---|---
Original language | English
Pages (from-to) | 235-254
Number of pages | 20
Journal | Theoretical Computer Science
Volume | 395
Issue number | 2-3
Publication status | Published - 1 May 2008
Keywords
- Probabilistic context-free languages
- Probabilistic finite automata
- Probabilistic language distances
- Language entropy
- Kullback-Leibler divergence
- Relative entropy
- Models