This page contains materials intended to facilitate class discussion (excerpts from readings, outlines of issues, links to resources, etc.). The materials are not necessarily the same as the instructor's teaching notes and are not designed to represent a full exposition or argument. This page is subject to revision as the instructor finalizes preparation. (Last revised 4/11/07)
Media and Communication
Where we are in the course:
Information Paradigm | Signature Technologies | Logical Architecture | Peak Epoch (Period of Monopolistic or Cartel Dominance)
Information as Mass Media | Radio, Photography, Film, TV, Magazines | Broadcast Model | 1920s-1970s
Information as Communication | Telegraphy, Telephony, Radio | Transmission Model | 1940s-70s (AT&T breakup in 1984)
Information as Computing I: Age of the Mainframe | Mainframes and Minicomputers, Databases | Centralized Information Services | 1950s-1970s (1969-82 anti-trust suit against IBM)
Information as Computing II: Age of Distributed Computing | PCs, Networks (LANs, WANs), the "Software Revolution," Graphical User Interface (GUI), WWW | Client/Server Architecture, Packetization | 1980s-2000s (1998-2002 anti-trust suit against Microsoft)
The fundamental relationship between "media" and "communication"
Consider the notion of the "ancestral environment" of information. Albert Borgmann, Holding On to Reality: The Nature of Information at the Turn of the Millennium (Chicago: Univ. of Chicago Press, 1999):

"Information about reality exhibits its pristine form in a natural setting. An expanse of smooth gravel is a sign that you are close to a river. Cottonwoods tell you where the river bank is. An assembly of twigs in a tree points to ospreys. The presence of ospreys shows that there are trout in the river. In the original economy of signs, one thing refers to another in a settled order of reference and presence. A gravel bar seen from a distance refers you to the river. It is a sign. When you have reached and begun to walk on the smooth and colored stones, the gravel has become present in its own right. It is a thing. And so with the trees, the nest, the raptors, and the fish." (p. 1)

"The ancestral environment is the ground state of information and reality. Human beings evolved in it, and so did their ability to read its signs." (p. 24)
How does one share information across time and space? The answer lies in the evolution of media as transportable, autonomous communication. Communications and media are two sides of the same coin: humans (unlike animals) communicate through media.
Co-evolution of 20th-C. Media and Communication

In the middle of the 20th century, the "media" and "communications revolutions" were parallel events that radically increased the transportability and autonomy of media-as-communications. Mediated communications "thickened" into an increasingly complex, self-regulating system removed from the "ancestral" natural and social systems of human information:
[Diagram: sender-receiver pairs in a mediated communication system]
Marshall McLuhan: "the medium is the message" (the medium is its own message, separate from the meaning of the message)

Claude Shannon: the communication system (bits in the channel) is the message. "Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem"
The Communications Revolution: "The Mathematical Theory of Communication"

Today, we look at the communications revolution in the mid-20th century, specifically the "mathematical theory of communication" (or "information theory") as it was invented by Claude Shannon of Bell Labs and expounded by Warren Weaver. The theory revolutionized telecommunications and information processing immediately after WW II (and epitomized the problems of transmission, cryptography, and ultimately cybernetic technology that the war had focused attention upon) (cf. Vannevar Bush's "As We May Think," 1945).
It also had a cultural influence well beyond its original technological context. Examples: Thomas Pynchon on entropy, or A. J. Greimas on narratology (cf. A. J. Greimas, Structural Semantics).
Claude Shannon's "Mathematical Theory of Communication" (1948): Some Basic Principles

"The recent development of various methods of modulation such as PCM and PPM which exchange bandwidth for signal-to-noise ratio has intensified the interest in a general theory of communication." (On PCM and PPM, see the definitions at the end of this page.)
"The
fundamental problem of communication
is that of reproducing at one point
either exactly or approximately
a message selected at another point.
Frequently the messages have meaning;
that is they refer to or are correlated
according to some system with certain
physical or conceptual entities.
These semantic
aspects of communication are irrelevant
to the engineering problem. The
significant aspect is that the actual
message is one selected from a set
of possible messages. The
system must be designed to operate
for each possible selection, not
just the one which will actually
be chosen since this is unknown
at the time of design."
Principles:

"Transmission," "Conduit," or "Transport" Model of Communication. Shannon: "By a communication system we will mean a system of the type indicated schematically in Fig. 1. It consists of essentially five parts" (in Shannon's enumeration: an information source, a transmitter, a channel, a receiver, and a destination).

Restriction of the channels, roles, and relations of information.

Quarantining of "noise" from "information" (Weaver, pp. 108-109).
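To make the transmission model concrete, here is a minimal sketch of the five-part pipeline in Python (my own illustration under assumed encodings, not Shannon's diagram or any actual implementation): an information source produces a message, a transmitter encodes it as bits, a noisy channel corrupts some of those bits, and a receiver decodes what arrives for the destination.

```python
import random

# Minimal sketch of Shannon's five-part communication system.
# The 8-bit character encoding and the bit-flip noise model are
# illustrative assumptions, not part of the source text.

def transmitter(message: str) -> list[int]:
    # Transmitter: encode each character as 8 bits.
    return [int(b) for ch in message for b in format(ord(ch), "08b")]

def channel(signal: list[int], noise_rate: float = 0.01) -> list[int]:
    # Channel plus noise source: each bit flips with probability noise_rate.
    return [bit ^ (random.random() < noise_rate) for bit in signal]

def receiver(signal: list[int]) -> str:
    # Receiver: decode 8-bit groups back into characters.
    groups = [signal[i:i + 8] for i in range(0, len(signal), 8)]
    return "".join(chr(int("".join(map(str, g)), 2)) for g in groups)

message = "HELLO"                                   # information source
received = receiver(channel(transmitter(message)))  # destination
print(received)  # usually "HELLO", occasionally corrupted by noise
```

Note how the pipeline is indifferent to what the message means: it operates identically on sense and nonsense, which is exactly the "engineering problem" Shannon isolates.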
Information as Statistical (Probability and Entropy):

Information is not the same as meaning: "semantic aspects of communication are irrelevant."

Information is instead a mathematical quantity related to the number of possible states of a message (i.e., to the probability set from which a message is selected). Example: flipping a coin vs. drawing a card (see the worked sketch after this list).
The more uncertain a message is (because it is being selected from a larger probability set), the more information it contains. Therefore: information is related to "entropy," the most general phenomenon in the universe. Weaver's explanation of the link between "information" and "entropy": pp. 103, 117 (on "entropy," see the Wikipedia article).
The problem of noise: Indeed, information is so general in its relation to entropy that even "noise" seems to be information (Weaver, pp. 108-109). So what prevents the concept of "information" from thus becoming too general, so that even noise is information?
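A small worked example of the coin vs. card comparison (my own sketch; the page gives only the comparison itself): Shannon's measure assigns log2(n) bits to a selection among n equally likely messages, so a card draw carries more information than a coin flip simply because it is drawn from a larger probability set.

```python
import math

# Information in a choice among n equally likely messages:
# H = log2(n) bits ("freedom of choice" in Weaver's sense).
def uniform_entropy(n: int) -> float:
    return math.log2(n)

print(uniform_entropy(2))    # coin flip: 1.0 bit
print(uniform_entropy(52))   # card draw: ~5.70 bits -- a larger
                             # probability set, hence more information
```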
Excerpts from Warren Weaver, "Recent Contributions to the Mathematical Theory of Communication" (1949)

The word communication will be used here in a very broad sense to include all of the procedures by which one mind may affect another. This, of course, involves not only written and oral speech, but also music, the pictorial arts, the theatre, the ballet, and in fact all human behavior. In some connections it may be desirable to use a still broader definition of communication, namely, one which would include the procedures by means of which one mechanism (say automatic equipment to track an airplane and to compute its probable future positions) affects another mechanism (say a guided missile chasing this airplane). (p. 95)
The word information, in this theory, is used in a special sense that must not be confused with its ordinary usage. In particular, information must not be confused with meaning. In fact, two messages, one of which is heavily loaded with meaning and the other of which is pure nonsense, can be exactly equivalent, from the present viewpoint, as regards information. (p. 99)
The quantity which uniquely meets the natural requirements that one sets up for "information" turns out to be exactly that which is known in thermodynamics as entropy. [ . . . ] Thus when one meets the concept of entropy in communication theory, he has a right to be rather excited--a right to suspect that one has hold of something that may turn out to be basic and important. That information be measured by entropy is, after all, natural when we remember that information, in communication theory, is associated with the amount of freedom of choice we have in constructing messages. Thus for a communication source one can say, just as he would also say it of a thermodynamic ensemble, "This situation is highly organized, it is not characterized by a large degree of randomness or of choice--that is to say, the information (or the entropy) is low." (p. 103)
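The formula compressed into this passage is Shannon's entropy of a source that selects among n possible messages with probabilities p_1, ..., p_n (the formula is standard in the 1948 paper, though not reproduced in the excerpt above):

    H = -\sum_{i=1}^{n} p_i \log_2 p_i

H reaches its maximum, log_2 n, when all choices are equally likely (maximum freedom of choice, maximum randomness) and falls to zero when one message is certain: the "highly organized" low-entropy case Weaver describes.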
Remember that the entropy (or information) associated with the process which generates messages or signals is determined by the statistical character of the process--by the various probabilities for arriving at message situations and for choosing, when in those situations the next symbols. The statistical nature of messages is entirely determined by the character of the source. But the statistical character of the signal as actually transmitted by a channel, and hence the entropy in the channel, is determined both by what one attempts to feed into the channel and by the capabilities of the channel to handle different signal situations. [ . . . ] The best transmitter, in fact, is that which codes the message in such a way that the signal has just those optimum statistical characteristics which are best suited to the channel to be used--which in fact maximize the signal (or one may say, the channel) entropy and make it equal to the capacity C of the channel. (p. 108)
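Weaver is here compressing Shannon's fundamental theorem for a noiseless channel (my gloss, drawn from the 1948 paper rather than from this excerpt): a source of entropy H bits per symbol can be encoded for transmission over a channel of capacity C bits per second at an average rate approaching, but never exceeding,

    \frac{C}{H} \text{ symbols per second}

so the "best transmitter" is the coder that closes this gap.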
How does noise affect information? Information is, we must steadily remember, a measure of one's freedom of choice in selecting a message. The greater this freedom of choice, and hence the greater the information, the greater is the uncertainty that the message actually selected is some particular one. Thus greater freedom of choice, greater uncertainty, greater information go hand in hand.

If noise is introduced, then the received message contains certain distortions, certain errors, certain extraneous material, that would certainly lead one to say that the received message exhibits, because of the effects of noise, an increased uncertainty. But if the uncertainty is increased, the information is increased, and this sounds as though the noise were beneficial! [ . . . ] It is thus clear where the joker is in saying that the received signal has more information. Some of this information is spurious and undesirable and has been introduced via the noise. To get the useful information in the received signal we must subtract out this spurious portion. (pp. 108-109)
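Shannon makes Weaver's "subtraction" precise with the notion of equivocation (again from the 1948 paper; the notation is Shannon's): the useful rate of transmission R is the source entropy minus H_y(x), the average remaining uncertainty about what was sent given what was received:

    R = H(x) - H_y(x)

On a noiseless channel the equivocation is zero and all received information is useful; noise raises H_y(x), and that spurious portion is exactly what must be subtracted out.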
The obvious first remark, and indeed the remark that carries the major burden of the argument, is that the mathematical theory is exceedingly general in its scope, fundamental in the problems it treats, and of classic simplicity and power in the results it reaches. This is a theory so general that one does not need to say what kinds of symbols are being considered--whether written letters or words, or musical notes, or spoken words, or symphonic music, or pictures. The theory is deep enough so that the relationships it reveals indiscriminately apply to all these and to other forms of communication. This means, of course, that the theory is sufficiently imaginatively motivated so that it is dealing with the real inner core of the communication problem--with those basic relationships which hold in general, no matter what special form the actual case may take. (pp. 114-15)
An engineering communication theory is just like a very proper and discreet girl accepting your telegram. She pays no attention to the meaning, whether it be sad, or joyous, or embarrassing. But she must be prepared to deal with all that come to her desk. (p. 116)
Suppose that we were asked to arrange the following in two categories--distance, mass, electric force, entropy, beauty, melody. I think there are the strongest grounds for placing entropy alongside beauty and melody, and not with the first three. Entropy is only found when the parts are viewed in association, and it is by viewing or hearing the parts in association that beauty and melody are discerned. All three are features of arrangement. It is a pregnant thought that one of these three associates should be able to figure as a commonplace quantity of science. The reason why this stranger can pass itself off among the aborigines of the physical world is that it is able to speak their language, viz., the language of arithmetic.
I feel sure that Eddington would have been willing to include the word meaning along with beauty and melody; and I suspect he would have been thrilled to see, in this theory, that entropy not only speaks the language of arithmetic; it also speaks the language of language. (p. 117)
Critiques of the transmission model of communication:

[1] Information and meaning arise only in the process of listeners, readers or viewers actively making sense of what they hear or see. Meaning is not 'extracted', but constructed.
[2] Linearity. The transmission model fixes and separates the roles of 'sender' and 'receiver'. But communication between two people involves simultaneous 'sending' and 'receiving' (not only talking, but also 'body language' and so on). In Shannon and Weaver's model the source is seen as the active decision-maker who determines the meaning of the message; the destination is the passive target. It is a linear, one-way model, ascribing a secondary role to the 'receiver', who is seen as absorbing information. However, communication is not a one-way street. Even when we are simply listening to the radio, reading a book or watching TV we are far more interpretively active than we normally realize.

There was no provision in the original model for feedback (reaction from the receiver). Feedback enables speakers to adjust their performance to the needs and responses of their audience. A 'feedback loop' was added by later theorists, but the model remains linear.
[3] Transmission models treat decoding as a mirror image of encoding, allowing no room for the receiver's interpretative frames of reference. Where the message is recorded in some form, 'senders' may well have little idea of who the 'receivers' may be (particularly, of course, in relation to mass communication). The receiver need not simply accept, but may alternatively ignore or oppose a message. We don't all necessarily have to accept messages which suggest that a particular political programme is good for us.
[4] In the transmission model the participants are treated as isolated individuals. Contemporary communication theorists treat communication as a shared social system. We are all social beings, and our communicative acts cannot be said to represent the expression of purely individual thoughts and feelings. Such thoughts and feelings are socio-culturally patterned.
[5] In models such as Shannon and Weaver's no allowance is made for relationships between people as communicators (e.g. differences in power). We frame what is said differently according to the roles in which we communicate. If a friend asks you later what you thought of this lecture you are likely to answer in a somewhat different way from the way you might answer the same question from the undergraduate course director in his office. The interview is a very good example of the unequal power relationship in a communicative situation.

People in society do not all have the same social roles or the same rights. And not all meanings are accorded equal value. It makes a difference whether the participants are of the same social class, gender, broad age group or profession. We need only think of whose meanings prevail in the doctor's surgery. And, more broadly, we all know that certain voices 'carry more authority' than others, and that in some contexts, 'children are to be seen and not heard'. The dominant directionality involved in communication cannot be fixed in a model but must be related to the situational distribution of power.
[6] Finally, the model is indifferent to the nature of the medium. And yet whether you speak directly to, write to, or phone a lover, for instance, can have major implications for the meaning of your communication. There are widespread social conventions about the use of one medium rather than another for specific purposes. People also differ in their personal attitudes to the use of particular media (e.g. word-processed Christmas circulars from friends!). Furthermore, each medium has technological features which make it easier to use for some purposes than for others. Some media lend themselves to direct feedback more than others. The medium can affect both the form and the content of a message. The medium is therefore not simply 'neutral' in the process of communication.
[7] Conclusion. In short, the transmissive model is of little direct value to social science research into human communication, and its endurance in popular discussion is a real liability. Its reductive influence has implications not only for the commonsense understanding of communication in general, but also for specific forms of communication such as speaking and listening, writing and reading, watching television and so on. In education, it represents a similarly transmissive model of teaching and learning. And in perception in general, it reflects the naive 'realist' notion that meanings exist in the world awaiting only decoding by the passive spectator. In all these contexts, such a model underestimates the creativity of the act of interpretation.

Alternatives to transmissive models of communication are normally described as constructivist: such perspectives acknowledge that meanings are actively constructed by both initiators and interpreters rather than simply 'transmitted'. However, you will find no single, widely-accepted constructivist model of communication in a form like that of Shannon and Weaver's block diagram. This is partly because those who approach communication from the constructivist perspective often reject the very idea of attempting to produce a formal model of communication. Where such models are offered, they stress the centrality of the act of making meaning and the importance of the socio-cultural context.
Definitions of "PCM" and "PPM" (contrasted with "PAM"), from Microsoft Press Computer Dictionary, 3rd ed. (Redmond, Wash.: Microsoft Press, 1997):
PAM: Pulse Amplitude Modulation. A method of encoding information in a signal by varying the amplitude of pulses. The unmodulated signal consists of a continuous train of pulses of constant frequency, duration, and amplitude. During modulation the pulse amplitudes are changed to reflect the information being encoded.

PCM: Pulse Code Modulation. A method of encoding information in a signal by varying the amplitude of pulses. Unlike pulse amplitude modulation (PAM), in which pulse amplitude can vary continuously, pulse code modulation limits pulse amplitudes to several predefined values. Because the signal is discrete, or digital, rather than analog, pulse code modulation is more immune to noise than PAM.

PPM: Pulse Position Modulation. A method of encoding information in a signal by varying the position of pulses. The unmodulated signal consists of a continuous train of pulses of constant frequency, duration, and amplitude. During modulation the pulse positions are changed to reflect the information being encoded.
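The PAM/PCM contrast can be made concrete in a few lines of code (a minimal sketch under my own assumptions; the dictionary entries describe the schemes only in prose). PAM transmits the continuously varying sample amplitudes; PCM snaps each amplitude to the nearest of a small set of predefined levels, which is what makes the signal digital and more immune to noise:

```python
import math

def sample(signal, n):
    # Take n evenly spaced samples of signal(t) for t in [0, 1).
    return [signal(i / n) for i in range(n)]

def pcm_quantize(samples, levels):
    # PCM step: snap each amplitude in [-1, 1] to one of `levels`
    # predefined values.
    step = 2.0 / (levels - 1)
    return [round((s + 1.0) / step) * step - 1.0 for s in samples]

wave = lambda t: math.sin(2 * math.pi * t)  # a test signal
pam = sample(wave, 8)        # PAM: amplitudes vary continuously
pcm = pcm_quantize(pam, 4)   # PCM: only 4 allowed amplitude values
print([round(x, 3) for x in pam])
print([round(x, 3) for x in pcm])
```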
References

Albert Borgmann, Holding On to Reality: The Nature of Information at the Turn of the Millennium (Chicago: Univ. of Chicago Press, 1999)

Jean Baudrillard, Simulations, trans. Paul Foss, Paul Patton, and Philip Beitchman (New York: Semiotext(e), 1983)

James R. Beniger, The Control Revolution: Technological and Economic Origins of the Information Society (Cambridge, Mass.: Harvard Univ. Press, 1986)

Clifford Geertz, The Interpretation of Cultures (New York: Basic, 1973), Chap. 1, "Thick Description: Toward an Interpretive Theory of Culture"; Chap. 15, "Deep Play: Notes on the Balinese Cockfight"

A. J. Greimas, Structural Semantics: An Attempt at a Method, trans. Daniele McDowell et al. (Lincoln: Univ. of Nebraska Press, 1983)

On cryptography and early computing during WW II:

Simon Singh, The Code Book: The Evolution of Secrecy from Mary, Queen of Scots to Quantum Cryptography (New York: Doubleday, 1999)

Neal Stephenson, Cryptonomicon (New York: Avon, 1999)