Advances in Info-Metrics: Information and Information Processing across Disciplines

Min Chen, J. Michael Dunn, Amos Golan, and Aman Ullah

Print publication date: 2020

Print ISBN-13: 9780190636685

Published to Oxford Scholarship Online: December 2020

DOI: 10.1093/oso/9780190636685.001.0001


A Computational Theory of Meaning

Chapter:
2 A Computational Theory of Meaning
Source:
Advances in Info-Metrics
Author(s):

Pieter Adriaans

Publisher:
Oxford University Press
DOI: 10.1093/oso/9780190636685.003.0002

A computational theory of meaning tries to understand the phenomenon of meaning in terms of computation. Here we give an analysis in the context of Kolmogorov complexity. This theory measures the complexity of a data set in terms of the length of the shortest program that generates the data set on a universal computer. As a natural extension, the set of all programs that produce a data set on a computer can be interpreted as the set of meanings of the data set. We give an analysis of the Kolmogorov structure function and some other attempts to formulate a mathematical theory of meaning in terms of two-part optimal model selection. We show that such theories will always be context dependent: the invariance conditions that make Kolmogorov complexity a valid theory of measurement fail for this more general notion of meaning. One cause is polysemy: one data set (i.e., a string of symbols) can have different programs, with no mutual information, that compress it. Another cause is the existence of recursive bijections between ℕ and ℕ² for which the two-part code is always more efficient; these generate vacuous optimal two-part codes. We introduce a formal framework to study such contexts, in the form of a theory that generalizes the concept of Turing machines to learning agents that have a memory and have access to each other's functions, in terms of a possible world semantics. In such a framework, the notions of randomness and informativeness become agent dependent. We show that such a rich framework explains many of the anomalies of the current theory of algorithmic complexity. It also provides perspectives for, among other things, the study of cognitive and social processes. Finally, we sketch some application paradigms of the theory.
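
For readers new to the notation, the following LaTeX sketch records the standard definitions the abstract relies on (plain Kolmogorov complexity, the invariance theorem, and the two-part/structure-function view of model selection). The notation is the usual one from the algorithmic-information literature and is not taken verbatim from the chapter.

% Kolmogorov complexity of a string x relative to a universal machine U:
\[
  K_U(x) \;=\; \min\{\, \lvert p \rvert \;:\; U(p) = x \,\}
\]
% Invariance theorem: for any two universal machines U and V there is a
% constant c_{UV}, independent of x, such that
\[
  \lvert K_U(x) - K_V(x) \rvert \;\le\; c_{UV}
\]
% Two-part (model + data-to-model) code and the Kolmogorov structure function:
% h_x(alpha) is the log-size of the best finite model S containing x whose
% own complexity stays within the model budget alpha.
\[
  h_x(\alpha) \;=\; \min\{\, \log_2 \lvert S \rvert \;:\; x \in S,\; K(S) \le \alpha \,\}
\]

Roughly, the abstract's argument can be read against these definitions: K_U(x) itself is machine-independent up to an additive constant, but the optimal split into a model term K(S) and a data-to-model term log|S| is not, and it is this decomposition that the proposed notion of meaning depends on.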

Keywords: philosophy of information, two-part code optimization, Kolmogorov complexity, meaningful information, possible world semantics, degree of informativeness
