The Information Paradox
By Peter J. Denning, Tim Bell
Classical information theory has no room for meaning, but humans persist in assigning meaning. How can we reconcile this difference?
DOI: 10.1511/2012.99.470
Since mathematician and communications engineer Claude Shannon developed information theory in the 1940s, the study of information has advanced rapidly. But alongside the field's growth and its widespread effects on people's lives, a series of perplexing questions has emerged. Most scientific disciplines rely on information-processing methods to discover new knowledge, and many scientists now say that information processes appear in nature. Even so, our models of these processes assume that they evolve without depending in any way on the meaning of the information they carry. How can such processes generate new knowledge?
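Shannon's own measure makes the paradox concrete: entropy, H = -Σ p_i log2 p_i, is computed from symbol frequencies alone, so a meaningful phrase and a meaningless rearrangement of the same characters carry exactly the same amount of classical information. The short Python sketch below is an illustrative addition, not part of the authors' argument; the function name and example strings are ours.

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Entropy in bits per symbol, computed from symbol frequencies alone."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

meaningful = "the cat sat on the mat"
scrambled = "tta eth no tam ecs tah"  # same characters, meaning destroyed

# Identical symbol statistics yield identical entropy (about 3.01 bits/symbol),
# even though one string means something and the other does not:
print(shannon_entropy(meaningful))
print(shannon_entropy(scrambled))
```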
Click "American Scientist" to access home page
American Scientist Comments and Discussion
To discuss our articles or comment on them, please share them and tag American Scientist on social media platforms. Here are links to our profiles on Twitter, Facebook, and LinkedIn.
If we re-share your post, we will moderate comments/discussion following our comments policy.