An Introduction to Information Theory: Symbols, Signals and Noise
by: John Robinson Pierce (author)
Rating: 5.00 (10)
Covers encoding and binary digits, entropy, language and meaning, efficient encoding and the noisy channel, and explores ways in which information theory relates to physics, cybernetics, psychology, and art. "Uncommonly good...the most satisfying discussion to be found." — Scientific American.... show more
Covers encoding and binary digits, entropy, language and meaning, efficient encoding and the noisy channel, and explores ways in which information theory relates to physics, cybernetics, psychology, and art. "Uncommonly good...the most satisfying discussion to be found." — Scientific American. 1980 edition.
Format: paperback
ISBN: 9780486240619 (0486240614)
Publisher: Dover Publications
Pages: 336
Edition language: English
Community Reviews
Musings/Träumereien/Devaneios rated it 5.0
{ s(Q|X) = -K SUM(p*ln(p)) }: "Symbols, Signals and Noise" by J. R. Pierce
(Original Review, 1980-12-05) Final answer to the question, "How many joules to send a bit?" The unit of information is determined by the choice of the arbitrary scale factor K in Shannon's entropy formula: { s(Q|X) = -K SUM(p*ln(p)) }. If K is made equal to 1/ln(2), then S is said to be measured in "bits" o...
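(A minimal sketch of the reviewer's point about the scale factor K, in Python; the example distribution is my own, not from the review: with K = 1 the entropy comes out in nats, and with K = 1/ln(2) the same sum comes out in bits.)

import math

def shannon_entropy(probs, K=1.0):
    # S = -K * sum(p * ln(p)); the arbitrary factor K fixes the unit.
    return -K * sum(p * math.log(p) for p in probs if p > 0)

p = [0.5, 0.25, 0.25]                          # illustrative distribution
nats = shannon_entropy(p)                      # K = 1        -> about 1.04 nats
bits = shannon_entropy(p, K=1 / math.log(2))   # K = 1/ln(2)  -> exactly 1.5 bits
print(nats, bits)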
Books by John Robinson Pierce