For instance, one factor that makes the computer-brain analogy seem so plausible is the ubiquitous talk of “information.” The word is often thrown around with total disregard for its roots in the lifeworld — specifically, the world of mid-20th-century communications. The seminal work in information theory is Claude Shannon’s 1948 paper “A Mathematical Theory of Communication,” which is mainly about the efficiency with which a certain sequence (say, a set of dots and dashes) can be transmitted and reproduced. There is no reference here to truth, awareness or understanding. As Shannon puts it, the “semantic aspects of communication are irrelevant to the engineering problem.” But concepts from information theory, in this restricted sense, have come to influence our notions of “information” in the broader sense, where the word suggests significance and learning. This may be deeply misleading. Why should we assume that thinking and perceiving are essentially information processing? Our communication devices are an important part of our lifeworld, but we can’t understand the whole in terms of the part.
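Shannon's point can be made concrete in a few lines of code. Entropy in his sense depends only on the statistics of the symbols, not on what (if anything) they mean: scramble a sentence into gibberish and its character frequencies, and hence its entropy, are unchanged. The sketch below is illustrative, not from Shannon's paper; the `shannon_entropy` helper is our own.

```python
from collections import Counter
from math import log2
import random

def shannon_entropy(text: str) -> float:
    """First-order Shannon entropy, in bits per symbol, of a character sequence."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

message = "the quick brown fox jumps over the lazy dog"
chars = list(message)
random.shuffle(chars)
scrambled = "".join(chars)

# The scrambled string is meaningless to a reader, but its symbol
# statistics are identical, so its entropy is the same (up to
# floating-point rounding).
print(f"{shannon_entropy(message):.6f}")
print(f"{shannon_entropy(scrambled):.6f}")
```

Both lines print the same number: the "engineering problem" of transmitting these two strings efficiently is identical, even though one is a sentence and the other is noise.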