Information Theory and Art
by Jim Bumgardner
part 1 of 3

I have found that Information Theory provides a very useful metaphorical framework for understanding human perception of art. When we say things like:

"That picture is too noisy"
"The room is too cluttered"
"This song is boring"
"That script has good pacing"
"Put some reverb on that mic"

we are unintentionally invoking the spirit of Claude Shannon, who first described Information Theory in the late 1940s, at the dawn of the information age. Many artists I talk to are unaware of the existence of Information Theory, yet they unconsciously dabble in it all the time. Sadly, Information Theory is normally taught only to technical people and not to artists. Having attended art school myself, I eventually stumbled across Shannon's work some years afterwards, as I taught myself the principles of computer programming. I found Shannon's work to be profoundly relevant to my own sensory experience, and a useful guiding principle in much of the computer-generated art and music I have produced.

OVERVIEW OF INFORMATION THEORY

In his 1948 paper, "A Mathematical Theory of Communication", Shannon describes an idealized model, or framework, for understanding communication. This model involves a sender (what Shannon calls an "Information Source"), a listener (what Shannon calls the "Destination"), and a channel thru which the sender communicates with the listener. Shannon provides both analog and discrete (digital) versions of this model. In the digital model, the sender sends information, in the form of discrete symbols, thru the channel to the listener. In the analog model, the sender sends a continuous signal. For the examples in this part of the article, I'll use the digital model.
The sender encodes the information using a transmitter. The listener decodes it using a receiver. Shannon worked for the phone company, so an easy way to think of the transmitter and receiver is the mouthpiece and earpiece on a telephone.

Shannon provides precise formulae for quantifying how successfully the information the sender wishes to send can be retrieved by the listener. Two things prevent the listener from getting the information:

1) All communications channels have a limited capacity for passing information. This capacity is called the bandwidth of the channel. Channels with a higher bandwidth can send more information over a shorter period of time. If the bandwidth is not high enough, the information cannot be sent in the allotted period of time.

2) There is a distinct possibility for errors (in the form of noise) to be introduced into the stream of information as it passes thru the channel, before it gets to the receiver.

This process is shown in the illustration from Shannon's paper.

In his paper, Shannon provides a way to precisely measure the unique information in a stream of information, and calls this measurement entropy. Entropy describes the amount of change, or unpredictability, in the signal. A stream of symbols that is all the same symbol contains no useful information, and has an entropy of zero.

A A A A A A A A A A A A A A A A A A A

If there is useful information in that stream, then there will be some changes in those symbols.

A L O V E L Y P I E C E O F P I E

If the stream is packed with information, then there will be a lot of changes, and the stream will have a high level of entropy.

A L P O P I A T I W S T P G A G M S

Although entropy does not necessarily correspond to useful information, the two are loosely related. Signals with no entropy cannot have information in them. Signals carrying some amount of information must have at least a corresponding amount of entropy, or greater.
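For readers who want to see the measurement concretely: the entropy of a stream of symbols is H = −Σ p(x) log₂ p(x), summed over the distinct symbols x, where p(x) is how often each symbol occurs. The following minimal Python sketch (my illustration, not from the original article) estimates the entropy of the three example streams above from their symbol frequencies:

```python
import math
from collections import Counter

def entropy(stream):
    """Estimate Shannon entropy (bits per symbol) from symbol frequencies."""
    counts = Counter(stream)
    total = len(stream)
    # H = sum of p(x) * log2(1/p(x)) over the distinct symbols x
    return sum((n / total) * math.log2(total / n) for n in counts.values())

# The three example streams from the article, with spaces removed:
print(entropy("AAAAAAAAAAAAAAAAAAA"))  # 0.0  -- all one symbol, no change
print(entropy("ALOVELYPIECEOFPIE"))    # ~3.1 -- some repetition (E, I, P)
print(entropy("ALPOPIATIWSTPGAGMS"))   # ~3.2 -- many distinct symbols
```

Note that this simple estimate looks only at how often each symbol occurs, not at the order in which they occur, so the English phrase and the scrambled stream score similarly here. Capturing the predictability of symbol sequences, which is what makes English text so much more compressible than random letters, requires looking at longer contexts, as Shannon did in his later work on the entropy of printed English.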