
"To Get More Data, Add Noise"

A Viterbi graduate student co-authors a paper explaining how noise can power processes

October 26, 2011 —

For years, Bart Kosko of the University of Southern California Viterbi School of Engineering has been following a research path that seems like a contradiction in terms: using noise to improve signal and information processing.

Doctoral candidate Brandon Franzke: a theorem proving that the right amount of noise can speed convergence processes.
A forthcoming paper co-authored with his doctoral student Brandon Franzke in the American Physical Society's Physical Review E is the latest and perhaps the largest step on this path. The result, says Franzke, could lead to ways of speeding up biological evolution in preferred directions, with possible applications in other areas, including the improvement of search engines.

In information theory, "noise" is not unwanted sound but a general term for unstructured energy that overlaps the energy forms a system uses to structure information.

It has long been known that some nonlinear systems, including mechanisms found in living cells, function better and faster with noise.

"The noise can be white noise," wrote Kosko in his 2006 book “Noise,” "or colored noise or chaos or any other energetic disturbance that interferes with some signal of interest present in the system."

The new paper offers a mathematical understanding of how this process works in a set of systems not previously known to have the capability to use noise power.

Markov chains are processes that move probabilistically from state to state and are memoryless: the move to a new state depends only on the system's current state, regardless of its past, just as the next move on a chess board depends only on the board's current configuration.

Markov chains describe many important processes in physics, biology and economics. These systems change state indefinitely in a type of random walk, but the chain eventually settles down to an equilibrium probability distribution that describes, in the long run, how frequently the system visits any given state.
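To make the idea concrete, here is a minimal Python sketch (an illustration added here, not material from the paper) of a hypothetical three-state chain: repeated steps drive the state distribution toward the same equilibrium no matter where the chain starts.

```python
# A minimal, hypothetical three-state Markov chain (not from the paper).
# Each row of P gives the probabilities of the next state given only the
# current state; repeated steps drive the state distribution toward the
# chain's equilibrium.
import numpy as np

P = np.array([[0.5, 0.3, 0.2],    # P[i, j] = probability of moving from
              [0.1, 0.7, 0.2],    # state i to state j on the next step
              [0.2, 0.3, 0.5]])

dist = np.array([1.0, 0.0, 0.0])  # start with certainty in state 0
for _ in range(50):
    dist = dist @ P               # one Markov step: depends only on dist

print("long-run state frequencies:", np.round(dist, 4))
# Starting from any other state yields the same limiting distribution,
# which is the equilibrium the article describes.
```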

The paper, entitled “Noise Can Speed Convergence in Markov Chains,” presents a new result: such convergent systems "benefit from adding small amounts of noise. Too little noise produces little or no benefit while too much noise can swamp the system's performance."

Most of the paper is a systematic theoretical presentation of the new understanding, the “Markov Chain Noise Benefit Theorem,” which maps the rules of the noise-benefit game for a number of important Markov systems when small amounts of noise are added.

Professor Bart Kosko, author of the 2006 book "Noise."
The paper shows how adding noise to a Markov chain brings each step closer to the chain's equilibrium than it would get without noise. The system is still on the same road to the same set destination, but with noise it makes fewer false steps. The paper notes that the number of steps needed to reach equilibrium is a factor: the more steps, the greater the benefit of the right amount of noise. As Kosko puts it, "All Markov roads still lead to Rome, but with the right amount of noise it takes fewer steps to get to Rome."
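The claim can be explored on a toy example. The hedged Python sketch below (our illustration, not the paper's theorem or its noise model) counts how many steps a small chain's distribution needs to come within a tolerance of its equilibrium, with and without small zero-mean perturbations; whether a benefit appears depends on the chain, the noise level and the noise model, which is exactly what the theorem's conditions pin down.

```python
# A toy scaffold for experimenting with noise and convergence speed.
# This is NOT the paper's construction; it simply perturbs the running
# state distribution with zero-mean noise and measures steps to equilibrium.
import numpy as np

def steps_to_equilibrium(P, pi, noise_scale=0.0, tol=1e-3, max_steps=10_000, seed=0):
    """Count steps until the running distribution is within tol (L1 distance) of pi.

    noise_scale sets the size of zero-mean perturbations added after each
    Markov step; the result is clipped and renormalized so it remains a
    valid probability vector. noise_scale=0 gives the ordinary noiseless chain.
    """
    rng = np.random.default_rng(seed)
    dist = np.zeros(len(pi))
    dist[0] = 1.0                                  # start in state 0
    for step in range(1, max_steps + 1):
        dist = dist @ P                            # one ordinary Markov step
        if noise_scale > 0:
            dist = np.clip(dist + noise_scale * rng.normal(size=dist.shape), 0, None)
            dist = dist / dist.sum()               # back onto the probability simplex
        if np.abs(dist - pi).sum() < tol:
            return step
    return max_steps

P = np.array([[0.5, 0.3, 0.2],                     # hypothetical 3-state chain
              [0.1, 0.7, 0.2],
              [0.2, 0.3, 0.5]])
pi = np.linalg.matrix_power(P, 1000)[0]            # its equilibrium distribution

for scale in (0.0, 1e-4, 1e-1):
    print(f"noise scale {scale}: {steps_to_equilibrium(P, pi, scale)} steps")
```

The largest noise level typically prevents the toy chain from ever reaching the tolerance, which mirrors the paper's warning that too much noise swamps performance.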

The paper successfully applies the theory to observed results in three well-known convergent systems. “The first model is a two-parameter Ehrenfest diffusion model that shows how noise benefits can occur in the class of birth-death processes. The second model is a Wright-Fisher model of genotype drift in population genetics. The third model is a chemical reaction network of zeolite crystallization.”
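For orientation on the first of these models, here is a minimal Python sketch of the classical one-parameter Ehrenfest urn (the paper analyzes a two-parameter variant): N balls sit in two urns, a randomly chosen ball switches urns at each step, and the count in one urn performs a birth-death Markov chain whose long-run visit frequencies approach a binomial distribution.

```python
# Classical Ehrenfest urn (one parameter), shown for orientation only;
# the paper's model is a two-parameter variant. The number of balls in
# urn A is a birth-death Markov chain.
import numpy as np

N = 10
rng = np.random.default_rng(0)
state = 0                                  # balls currently in urn A
visits = np.zeros(N + 1)
for _ in range(200_000):
    # With probability (N - state)/N the randomly chosen ball is in urn B
    # and moves into urn A ("birth"); otherwise a ball leaves urn A ("death").
    state += 1 if rng.random() < (N - state) / N else -1
    visits[state] += 1

print("visit frequencies:", np.round(visits / visits.sum(), 3))
# These frequencies approach the binomial(N, 1/2) equilibrium distribution.
```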

The implications are not just theoretical. The researchers say the finding shows immediate promise for applications in a number of areas, including genetics, physics, chemistry and Monte Carlo estimation in statistics. For example, said Kosko, the insight may help speed up the patented search algorithm that Google uses to rank web pages, since that technology models web surfing as a Markov chain of random surfers.
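For context on that last example, here is a standard textbook power-iteration sketch of the random-surfer Markov chain behind PageRank, written in Python over a tiny hypothetical link graph; it is not Google's production algorithm and not the paper's noisy variant.

```python
# Textbook random-surfer sketch of PageRank: with probability d the surfer
# follows a random outgoing link, otherwise jumps to a uniformly random page.
import numpy as np

links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}   # tiny hypothetical web graph
n, d = len(links), 0.85                       # d is the damping factor

# Column-stochastic link matrix: M[j, i] = probability of surfing from page i to j.
M = np.zeros((n, n))
for i, outs in links.items():
    for j in outs:
        M[j, i] = 1.0 / len(outs)

rank = np.full(n, 1.0 / n)
for _ in range(100):                          # power iteration toward equilibrium
    rank = d * (M @ rank) + (1 - d) / n

print("PageRank scores:", np.round(rank, 3))
```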

Kosko is a professor in the Viterbi School's Ming Hsieh Department of Electrical Engineering and a professor of engineering and law in USC's Gould School of Law. He is a fellow of the IEEE, of the International Neural Network Society and of the International Fuzzy Systems Association.