Path: utzoo!attcan!uunet!zephyr.ens.tek.com!uw-beaver!mit-eddie!rutgers!netnews.upenn.edu!eniac.seas.upenn.edu!sklarew
From: sklarew@eniac.seas.upenn.edu (Dann Sklarew)
Newsgroups: comp.ai
Subject: Re: How much info can the brain hold?
Message-ID: <33870@netnews.upenn.edu>
Date: 2 Dec 90 03:33:47 GMT
References: <11941@hubcap.clemson.edu> <7492@hub.ucsb.edu>
Sender: news@netnews.upenn.edu
Reply-To: sklarew@eniac.seas.upenn.edu (Dann Sklarew)
Organization: University of Pennsylvania
Lines: 41

In article <7492@hub.ucsb.edu> 6600dt@ucsbuxa.ucsb.edu (Dave Goggin) writes:
>In article <11941@hubcap.clemson.edu> svissag@hubcap.clemson.edu (Steve L Vissage II) writes:
>
>>I've heard estimates of how many neurons the human brain contains, somewhere
>>in the trillions, I believe.  Has there ever been a reliable estimate of
>>how much information, in bits or other computer-relevant units, can be
>>contained in that structure?
>>
>
>I'd follow up with another question of brain/computer comparison.  It is
>known that much of the brain's power comes from the high degree of
>parallel processing involved.  What is the speed (in MHz, or other units)
>that the brain runs at, and how does it vary with state?  Also, how does
>this compare with existing parallel-processing hardware?
>

Computational neuroscientist Terrence J. Sejnowski (Salk Institute) has
cited the best current estimate of brain complexity as 10^14 synapses:
"If we assume that synapses are sites of information storage, then we can
make a rough estimate for the total information stored in the brain (given
that each synapse stores only a few bits) . . . around 10^14 bits."  Given
an activation rate of 10 synapses per second, he states that the brain must
be performing at least 10^15 operations per second.  This is a full five
orders of magnitude above the capacity of the Connection Machine, one of
the largest of today's highly parallel computers.
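The arithmetic behind Sejnowski's figures can be checked in a few lines.
The sketch below is just a back-of-envelope restatement of the numbers in
the article; the "bits per synapse" value is an assumption standing in for
his "a few bits," and the Connection Machine figure is merely what "five
orders of magnitude below the brain" implies, not a measured spec.

```python
import math

synapses = 10**14            # Sejnowski's estimate of synapses in the brain
bits_per_synapse = 2         # "a few bits" per synapse (assumed value)
storage_bits = synapses * bits_per_synapse

activations_per_sec = 10     # activation rate cited in the article
ops_per_sec = synapses * activations_per_sec   # lower bound: 10^15 ops/s

# Implied by "five orders of magnitude above the Connection Machine":
connection_machine_ops = 10**10
gap = math.log10(ops_per_sec / connection_machine_ops)

print(f"stored information ~ {storage_bits:.0e} bits")
print(f"brain throughput   >= 10^{int(math.log10(ops_per_sec))} ops/s")
print(f"gap vs. Connection Machine: {gap:.0f} orders of magnitude")
```

Since each synapse contributes only a constant number of bits and
operations, both totals scale linearly with the synapse count, which is
why the 10^14 connectivity figure dominates every estimate here.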
Thus, beyond the total number of neurons in the brain (10^12, according to
neurobiologist Eric Kandel), it is this intricate synaptic connectivity
that allows the brain to out-perform advanced VLSI technologies.

Ref: Sejnowski, Terrence.  1989.  In "The Computer and The Brain:
Perspectives on Human and Artificial Intelligence," edited by Jean R.
Brink and C. Roland Haden.  New York: Elsevier.  Includes a figure
plotting the logarithm of the number of elementary operations per second
achieved by the largest digital computers as a function of time (a line
estimated to reach the aforementioned brain capability by 2020).