
An Information Theoretic Approach to Symbolic Learning in Synthetic Languages

Overview
Journal: Entropy (Basel)
Publisher: MDPI
Date: 2022 Feb 25
PMID: 35205553
Abstract

An important aspect of using entropy-based models and proposed "synthetic languages" is the seemingly simple task of knowing how to identify the probabilistic symbols. If the system has discrete features, then this task may be trivial; however, for observed analog behaviors described by continuous values, this raises the question of how we should determine such symbols. This task of symbolization extends the concept of scalar and vector quantization to consider explicit linguistic properties. Unlike previous quantization algorithms, where the aim is primarily data compression and fidelity, the goal in this case is to produce a symbolic output sequence which incorporates some linguistic properties and hence is useful in forming language-based models. In this paper, we therefore present methods for symbolization which take such properties into account in the form of probabilistic constraints. In particular, we propose new symbolization algorithms which constrain the symbols to follow a Zipf-Mandelbrot-Li distribution, which approximates the behavior of language elements. We introduce a novel constrained EM algorithm which is shown to effectively learn to produce symbols approximating a Zipfian distribution. We demonstrate the efficacy of the proposed approaches on examples using real-world data in different tasks, including the translation of animal behavior into a possible human-understandable language equivalent.
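
As a rough illustration of the kind of constrained symbolization the abstract describes, the sketch below fits a one-dimensional Gaussian-mixture quantizer with EM but, after every M-step, replaces the mixture weights with rank-matched Zipf-Mandelbrot probabilities p_k ∝ 1/(k + q)^s. This is not the paper's constrained EM algorithm; the function names, the component count K, and the parameters s and q are assumptions made purely for the example, and only NumPy is used.

    # Illustrative sketch only: a 1-D Gaussian-mixture "symbolizer" whose mixture
    # weights are constrained, after every M-step, to a rank-matched
    # Zipf-Mandelbrot target p_k proportional to 1/(k + q)^s. This is NOT the
    # authors' constrained EM algorithm; K, s, and q are arbitrary choices.
    import numpy as np

    def zipf_mandelbrot(K, s=1.0, q=2.7):
        """Target symbol probabilities p_k proportional to 1/(k + q)^s, k = 1..K."""
        p = 1.0 / np.arange(1 + q, K + 1 + q) ** s
        return p / p.sum()

    def constrained_em_symbolize(x, K=8, s=1.0, q=2.7, n_iter=50, seed=0):
        """Quantize a continuous 1-D signal x into K symbols whose empirical
        frequencies are pushed toward a Zipf-Mandelbrot distribution."""
        rng = np.random.default_rng(seed)
        x = np.asarray(x, dtype=float)
        mu = rng.choice(x, K, replace=False)          # component means
        sigma = np.full(K, x.std() + 1e-9)            # component std devs
        target = zipf_mandelbrot(K, s, q)             # constrained weights
        w = target.copy()

        for _ in range(n_iter):
            # E-step: responsibilities under the current Gaussian components.
            d = (x[:, None] - mu[None, :]) / sigma[None, :]
            log_r = np.log(w[None, :] + 1e-300) - 0.5 * d**2 - np.log(sigma[None, :])
            log_r -= log_r.max(axis=1, keepdims=True)
            r = np.exp(log_r)
            r /= r.sum(axis=1, keepdims=True)

            # M-step: update means and variances as usual ...
            nk = r.sum(axis=0) + 1e-12
            mu = (r * x[:, None]).sum(axis=0) / nk
            sigma = np.sqrt((r * (x[:, None] - mu[None, :])**2).sum(axis=0) / nk) + 1e-9

            # ... but constrain the mixture weights: rank components by usage and
            # assign them the Zipf-Mandelbrot probabilities instead of nk / N.
            order = np.argsort(-nk)
            w = np.empty(K)
            w[order] = target

        symbols = np.argmax(r, axis=1)                # hard symbol assignment
        return symbols, mu, w

    # Usage: symbolize a noisy continuous trace and inspect symbol frequencies.
    if __name__ == "__main__":
        t = np.linspace(0, 20, 2000)
        signal = np.sin(t) + 0.3 * np.random.default_rng(1).standard_normal(t.size)
        syms, centers, weights = constrained_em_symbolize(signal, K=8)
        print("empirical symbol frequencies:", np.bincount(syms, minlength=8) / syms.size)
        print("Zipf-Mandelbrot target:      ", np.round(weights[np.argsort(-weights)], 3))

Running the example on a noisy sinusoid prints the empirical symbol frequencies next to the Zipfian target, which is a quick way to check that the frequency constraint is actually being enforced by the modified M-step.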

Citing Articles

Estimating Sentence-like Structure in Synthetic Languages Using Information Topology.

Back A, Wiles J. Entropy (Basel). 2022; 24(7).

PMID: 35885083; PMC: 9317616. DOI: 10.3390/e24070859.

References
1. Peperkamp S. Phonological acquisition: recent attainments and new challenges. Lang Speech. 2004; 46(Pt 2-3):87-113. DOI: 10.1177/00238309030460020401.

2. Back A, Wiles J. Entropy Estimation Using a Linguistic Zipf-Mandelbrot-Li Model for Natural Sequences. Entropy (Basel). 2021; 23(9). PMC: 8468050. DOI: 10.3390/e23091100.

3. Kim J, Andre E. Emotion recognition based on physiological changes in music listening. IEEE Trans Pattern Anal Mach Intell. 2008; 30(12):2067-83. DOI: 10.1109/TPAMI.2008.26.

4. Allen B, Kon M, Bar-Yam Y. A new phylogenetic diversity measure generalizing the Shannon index and its application to phyllostomid bats. Am Nat. 2009; 174(2):236-43. DOI: 10.1086/600101.

5. Flipsen P. Measuring the intelligibility of conversational speech in children. Clin Linguist Phon. 2006; 20(4):303-12. DOI: 10.1080/02699200400024863.