
An Information Theorist Visits the Viterbi School — and the Brain

In living computers, the system is the code, says signals expert

March 13, 2006 —
[Photo: Toby Berger: the system is the code]
 What's a gifted electrical engineer whose career has been devoted to such Claude Shannon-centric topics as video compression, signature verification, and coherent signal processing doing deep inside the brain?
 
Delivering the sixth Viterbi centennial lecture, which was also the fourth annual Viterbi Lecture, on Thursday, March 9, Toby Berger reported the results of seven years of investigation (and speculation) into how neurons communicate with one another and with the outside world, bringing an information-theoretic focus to an old set of enigmas. The result was an intriguing mix of insight and mystery, of truly remarkable facts (every human brain continuously carries message traffic as intense as the entire Internet's) and elegant inference, leaving a feeling of awe at how much remains to be discovered.

Berger recently moved to the University of Virginia after a long and remarkably successful career at Cornell, which won him, among many other honors, election to membership in the National Academy of Engineering this year.

His brain work focuses on communication, his specialty, but in an "intraorganism" context, cells talking to cells. The key element of this, in the context of Shannon's work, is communication without coding.

Coding is clearly present in some biological systems, Berger noted. DNA is obviously a code -- but it serves primarily as an interorganism code, a way to transmit characteristics from one organism to another.

But for the brain's intraorganism communication, coding is absent. The channels evolved to become completely specific to their messages. The system, from another perspective, becomes the code.

Shannon-optimal performance, he noted (and proved, with equations built from basic Shannon results), is possible without coding under these circumstances, with source and channel "optimally matched over a wide range of power consumption levels."

The matching establishes channel capacity in a unique way. The cost, in joules per bit, of transmitting a message is low at low rates but rises rapidly as the bit rate increases: the faster you talk, the more each word costs, and that sets the limits.
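That trade-off can be made concrete with a textbook Shannon example (mine, not the lecture's): for an additive white Gaussian noise channel, rate grows only logarithmically with power, so the energy spent per bit climbs as the rate does.

```python
import math

# Illustrative sketch (not from the lecture): for an AWGN channel with
# capacity C = 0.5 * log2(1 + P/N) bits per channel use, the energy
# spent per bit is P/C. As the rate rises, each bit gets more
# expensive -- "the faster you talk, the more each word costs."
N = 1.0  # noise power, arbitrary units
for P in [0.1, 1.0, 10.0, 100.0]:
    C = 0.5 * math.log2(1 + P / N)   # achievable rate, bits per use
    print(f"power {P:6.1f} -> rate {C:5.2f} bits/use, "
          f"cost {P / C:6.2f} joules/bit")
```

Running the loop shows the per-bit cost climbing from about 1.5 to about 30 joules per bit as the rate grows, which is the shape of the limit Berger described.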

But what are the messages, and where are the channels? Berger enthusiastically plunged into basic neuroanatomy. A human brain contains about 10^11 neurons. Each neuron is connected to about 1,000 other neurons, creating a staggering total of roughly 10^14 active interconnections.

[Photo: The Viterbi lecturer with Andrew Viterbi]
The neurons cycle at a rate of about once every 2.5 milliseconds, with a non-firing being as much of a message as an energization. If you add up the activity, Berger noted, "the brain is distributing impulses to its cells at a rate greater than the Internet."
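The lecture's numbers invite a back-of-envelope check. The sketch below is my arithmetic, not Berger's slides, and it treats the 2.5 ms cycle as a rough per-neuron event rate:

```python
# Back-of-envelope arithmetic behind the lecture's numbers; the cycle
# time as an event rate is a rough assumption, not a measurement.
neurons = 1e11                      # ~10^11 neurons per human brain
fan_out = 1e3                       # each one talks to ~1,000 others
connections = neurons * fan_out     # ~10^14 active interconnections
print(f"interconnections: {connections:.0e}")

cycle_s = 2.5e-3                    # one firing/non-firing per ~2.5 ms
deliveries_per_s = connections / cycle_s
print(f"impulse deliveries per second: {deliveries_per_s:.0e}")  # ~4e16
```

Even at a single bit per delivery, some 10^16 events per second comfortably exceeds the total traffic of the mid-2000s Internet, which is the comparison Berger drew.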

But the information-processing patterns are vastly different from those of electronic computers. There is no central clock, no sharply synchronized dance of crisp ones and zeros. Each neuron fires according to its own clock, and the pulse and its transmission along the nerve axons stretch out over time, so that a signal's leading edge doesn't reach the farthest of the 1,000 or so receiving neurons until after its trailing edge has reached the nearest.

According to Berger, signal processing considerations dictate some conclusions about what is going on in this intense bustle of activity:
  • The message carried in an individual nerve impulse is independent of the amplitude of the impulse. Rather, the message comes in the temporal pattern of successive pulses.
  • This pattern in turn is determined by dynamic adjustment of the threshold of the firing neurons -- that is, the minimum input impulse it takes to fire a neuron. If that threshold were fixed and inflexible, according to Berger, the system couldn't function (a toy model of this dynamic threshold is sketched after this list).
  • But some messages (e.g., reflexes) have to be acted on too quickly for temporal coding to work.
  • Messages aren't just repeated in the succession of energizations of corresponding neurons, spreading to a wider area of the brain. Rather, they change as they pass through successive neurons.
  • Neurons need a certain minimum input of excitation to remain alive and functional.
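Here is the minimal sketch of the dynamic-threshold idea promised above -- a toy integrate-and-fire model of my own devising, not the Berger-Levy neuron. Every spike has the same amplitude, the threshold jumps after each firing and relaxes back, and the output message lives entirely in the inter-spike intervals:

```python
import random

# Toy leaky integrate-and-fire neuron with an adaptive threshold
# (an illustrative model, not the one from the lecture). The output
# "message" is the sequence of inter-spike intervals: every spike has
# the same amplitude, so only timing carries information.
def simulate(steps=2000, dt=1e-3, seed=1):
    random.seed(seed)
    v, theta = 0.0, 1.0          # membrane potential, firing threshold
    spikes = []
    for t in range(steps):
        v += dt * (-v / 0.02 + random.uniform(0, 120))  # leak + noisy drive
        theta += dt * (1.0 - theta) / 0.05              # threshold decays to 1
        if v >= theta:           # fire: emit spike, reset, raise threshold
            spikes.append(t * dt)
            v = 0.0
            theta += 0.5         # dynamic adjustment after each firing
    return spikes

spikes = simulate()
intervals = [b - a for a, b in zip(spikes, spikes[1:])]
print(f"{len(spikes)} spikes; first intervals (s): {intervals[:5]}")
```

With a fixed threshold the same drive would produce a nearly metronomic spike train carrying almost no information; letting the threshold adapt is what shapes the temporal pattern.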

The central mystery remains how this immense collection of simple, standard individual elements becomes, by elusive laws, a network that can process information. Berger looked to the brain's development for clues. Almost all brain cells are formed between the second trimester of pregnancy and age one. All the synapses are in place by age two, and their arrangement doesn't change much afterward, though some rewiring takes place.

And, according to Berger, it's clear that the brain is a Markov chain -- a system in which the route that took it to a given state is irrelevant to determining where it's going: all that counts is the current state.  
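The Markov property can be made concrete with a generic two-state example (an illustration of the property, not Berger's model of the brain): however the chain arrived at its current state, the distribution of the next state is the same.

```python
import random

# Generic two-state Markov chain: the distribution of the next state
# depends only on the current state, never on the path that led there.
P = {"rest": {"rest": 0.9, "fire": 0.1},
     "fire": {"rest": 0.7, "fire": 0.3}}

def step(state):
    r, acc = random.random(), 0.0
    for nxt, p in P[state].items():
        acc += p
        if r < acc:
            return nxt
    return nxt  # float-rounding fallback

state, path = "rest", []
for _ in range(10):
    state = step(state)
    path.append(state)
print(" -> ".join(path))  # only the current state matters at each step
```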
[Photo: Dean Yannis Yortsos congratulates Berger after the lecture]

Berger emphasized one crucial aspect of the system: its longevity and adaptiveness. The elements had come together over millions of years, tuning themselves to one another. Almost wistfully, he recalled asking his collaborator on much of his brain research, W. B. "Chip" Levy of the University of Virginia medical school, "why organisms haven't evolved using silicon for information." Levy told him, he related, that a child falling on its head (not very hard) "might stretch its nerve cells by 10 percent" from the impact.

In closing, he offered a Markov epigraph from Claude Shannon: "We have knowledge of the past, but we can't control it. We can control the future, but we have no knowledge of it."