The Information Paradox

Classical information theory has no room for meaning, yet humans persist in assigning meaning. How can we reconcile this difference?

November-December 2012, Volume 100, Number 6, page 470
DOI: 10.1511/2012.99.470

Since mathematician and communications engineer Claude Shannon developed information theory in the 1940s, the study of information has advanced rapidly. But alongside the field's growth and its widespread effects on people's lives, a series of perplexing questions has emerged. Most scientific disciplines rely on information-processing methods to discover new knowledge, and many scientists now argue that information processes occur in nature. Even so, our models assume that these processes evolve without depending in any way on the meaning of the information they carry. How can such processes generate new knowledge?
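Shannon's own definition makes the point concrete: the entropy of a message depends only on the relative frequencies of its symbols, never on what those symbols mean. A minimal sketch in Python (the example strings and the helper name shannon_entropy are illustrative, not from the article):

```python
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """Bits per symbol, computed from symbol frequencies alone."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Two strings with very different meanings to a human reader...
sentence = "the cat sat on the mat"
scramble = "tta ech mat nos atthe "  # same characters, shuffled

# ...get identical scores, because the measure sees only the
# probability distribution of symbols, not what they convey.
print(shannon_entropy(sentence))  # ~3.01 bits per symbol
print(shannon_entropy(scramble))  # exactly the same value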

