Nanocarbon modeling may be the next step toward emulating human brain function. That’s the focus of USC electrical engineering professor Alice Parker’s “synthetic cortex” project, featured this week by the National Science Foundation, which funded the three-year study.
EE Professor Alice Parker (seated, left) discusses an animation of a synaptic connector with graduate students Chih-Chieh Hsu (standing, center) and Jonathan Joshi (foreground, right).
Parker and co-principal investigator Chongwu Zhou, both of the USC Viterbi School’s Ming Hsieh Department of Electrical Engineering, have teamed up on the “BioRC (Biomimetic Real Time Cortex) Project,” a goal of which is to create nanocarbon brain neurons that can talk to each other. The research team includes Viterbi School electrical engineering graduate students Jonathan Joshi, Chih-Chieh Hsu, Adi Azar, Matthew Walker, Ko-Chung Tseng, Ben Raskob, Chuan Wang, Yoon Sik Cho, Changsoo Jeong and Jason Mahvash.
The team is studying the behavior of cortical neurons: what makes them fire and send signals through synaptic connectors to other neurons in the human cortex, as well as neurons’ “plasticity,” their ability to learn and remember.
Each time a neuron fires, it sends an electro-chemical spark through thousands of other neurons at speeds of up to 200 miles per hour. But with approximately 100 billion neurons in the human cortex, and approximately 60 trillion synaptic connections, the brain is massively interconnected, Parker says. That makes the task of unraveling a neuron’s electrical circuitry quite complicated.
“The brain is kind of like a biochemical factory, operating in a sphere that you can’t stretch out on integrated circuits and circuit boards in order to emulate all of its electrical activity,” Parker says. “The connectivity is too great and too many delays are introduced. We had to turn to nanotechnology to build something three dimensionally, so that eventually we'll be able to emulate how the neurons fire and activate others along a specific path within that sphere.”
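The firing-and-propagation behavior Parker describes can be illustrated in software with a textbook leaky integrate-and-fire model. The sketch below is purely illustrative and is not the BioRC team’s circuitry or code: all parameter values (time constant, threshold, synaptic weight) are assumptions chosen so the example spikes, and the second neuron is driven only by the first one’s spikes, mimicking a signal crossing a synaptic connector.

```python
# Illustrative sketch only (not the BioRC circuit): two leaky
# integrate-and-fire neurons, where each spike of neuron A injects
# current into neuron B through an assumed synaptic weight.

def simulate(steps=200, dt=1.0, tau=20.0, v_thresh=1.0,
             input_current=0.06, weight=1.2):
    """Simulate two LIF neurons; A is driven by a constant input,
    B is driven only by A's spikes. All parameters are assumptions."""
    v_a = v_b = 0.0                  # membrane potentials
    spikes_a, spikes_b = [], []
    for t in range(steps):
        # Neuron A: leaky integration of a constant input current.
        v_a += dt * (-v_a / tau + input_current)
        fired_a = v_a >= v_thresh
        if fired_a:
            v_a = 0.0                # reset after firing
            spikes_a.append(t)
        # Neuron B: receives current only when A fires.
        v_b += dt * (-v_b / tau + (weight if fired_a else 0.0))
        if v_b >= v_thresh:
            v_b = 0.0
            spikes_b.append(t)
    return spikes_a, spikes_b

spikes_a, spikes_b = simulate()
print(spikes_a, spikes_b)
```

With these made-up parameters, neuron A settles into a regular firing rhythm and neuron B fires in lockstep with it; the point of the sketch is only the qualitative behavior Parker describes, a spike in one neuron triggering activity in the next.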
“This is a big departure from some previous synthetic brain projects, which attempted to emulate neural behavior with electrical signals using conventional multiprocessors,” says Joshi, who has engineered the circuit design for artificial synapses that learn. “Nanocarbon modeling solves problems such as the sheer physical size of building a synthetic cortex and the cost of expensive electronics to power it, since the brain never shuts off.”
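One common way to make an artificial synapse “learn,” in the sense Joshi mentions, is spike-timing-dependent plasticity (STDP), where the synapse strengthens when the sending neuron fires just before the receiving one and weakens otherwise. The sketch below is an assumed, generic STDP rule for illustration, not the actual circuit Joshi designed; the function name and all constants are made up for the example.

```python
# Illustrative sketch only (generic STDP rule, not the BioRC design):
# the synaptic weight change depends on the relative timing of the
# presynaptic and postsynaptic spikes.
import math

def stdp_delta(pre_time, post_time, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Weight change for one pre/post spike pair (assumed constants).

    If the presynaptic spike precedes the postsynaptic one, the
    synapse strengthens (potentiation); otherwise it weakens.
    """
    dt = post_time - pre_time
    if dt > 0:                            # pre before post: potentiate
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)  # post before pre: depress

weight = 0.5
weight += stdp_delta(pre_time=10, post_time=15)   # pre led post: stronger
weight += stdp_delta(pre_time=30, post_time=25)   # post led pre: weaker
```

The exponential factors make nearly coincident spikes matter most, so the synapse gradually encodes which inputs reliably precede the neuron’s own firing, a simple software analogue of the learning behavior the hardware synapses aim for.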
In fact, until quite recently, Parker adds, the size and cost of available electronics made construction of complex brain-like structures totally impractical.
The team has already designed and simulated the transistor circuits for a single synapse, and a CMOS chip that will be used to validate the concepts is about to be fabricated, says Hsu, a senior member of the team and Ph.D. student in electrical engineering. Now it’s time to connect the structure to another synapse and study neural interconnectivity. By the end of the semester, she hopes to have “several synthetic neurons talking to each other.”
Ultimately, the researchers hope to answer one question: Will science ever be able to construct an artificial brain of reasonable size and cost that exhibits almost real-time behavior?
NSF image: Synthetic brain. An animated artist's conception of a carbon nanotube synapse. The orange nanotubes are PMOS transistors and the green nanotubes are NMOS transistors. The red features are transistor gates, and the blue features are metallic interconnections.
A lot is riding on it, Parker adds. Autonomous vehicle navigation, identity determination, robotic manufacturing, and medical diagnostics are all engineering challenges that could benefit from technological solutions involving artificial neural structures.
And in medicine, the stakes are even higher.
“Researchers have already built experimental cochlear implants that are able to restore some hearing in the deaf and new vision systems that can restore some sight to the blind, but what we’re working on now is what you’ll see 30 years in the future,” Parker says. “This is work that could revolutionize neural prosthetics, for one thing, and give us some pretty amazing biomimetic devices.”
For more information about the “BioRC Project,” visit Parker's departmental web site. The project also involves collaboration with Kang Wang, Alex Khitun and Mary Eshaghian-Wilner at UCLA, and Philip Wong and Jie Deng (now at IBM) at Stanford University, as well as neuroscience faculty at USC.