Path: utzoo!censor!geac!torsqnt!news-server.csri.toronto.edu!cs.utexas.edu!usc!zaphod.mps.ohio-state.edu!rpi!uupsi!njin!paul.rutgers.edu!aramis.rutgers.edu!athos.rutgers.edu!nanotech
From: erich@eecs.cs.pdx.edu (Erich Stefan Boleyn)
Newsgroups: sci.nanotech
Subject: Re: Nanotech Economy
Message-ID:
Date: 5 Dec 90 03:12:19 GMT
Sender: nanotech@athos.rutgers.edu
Lines: 96
Approved: nanotech@aramis.rutgers.edu

I haven't followed 'sci.nanotech' too closely, but I have been thinking
about similar ideas on my own... (at least the ones in this thread)

panix!alexis@cmcl2.nyu.edu (Alexis Rosen) writes:

>fear technology. But soon enough this won't matter. Joe average will self-
>evolve into a creature which may or may not have any physical resemblance to

[deleted]

>my eyesight and bad back :-) is grow my brain capacity and speed by at least
>six orders of magnitude. (Working from rough numbers of what should be
>possible, based on EoC and what I've read here.) There's too much to learn
>to stay biologically _stupid_, which is what we all are.

(Now it gets interesting... ;-)

>1) I'm still not convinced we can overcome the gray-goo problem. I'm also not
>convinced that anyone understands the subtleties enough to say anything
>intelligent about this. So it's not worth more than a passing mention.

I don't know what to do about this problem either... but I've been
attacking it from the other end, so I haven't considered it much.

>2) More importantly, I'm guilty myself (in the above paragraphs) of the same
>thing I accused Daniel of: overly limited vision. Like the gray-goo problem,
>though, I don't see how we can even approach this subject intelligently. When
>you're a million times smarter than you are today, what will be important to
>you? Will creativity still be a mystery? Will key "human" things, basic
>things like material and emotional desires, still have meaning? The point
>is, achieving "real" nanotech means that you've pretty much won the game
>of life, as we know it.
>Since I'm not Mr. Spock (I don't even play him on TV :-) I'm not going to
>hazard any guesses about life as we _don't_ know it.

Well, I think a better question to ask would be: how would you "expand"
your brain in size? And *why* would you do it?

Before you answer me with flames (;-), I'd like to say that as a student of
neuroscience and genetics, I've been thinking about this for a while. The
brain is so specifically wired (and so are the developmental mechanisms,
for that matter) that just turning the developmental mechanisms back on
(possibly mutated by this point in your life-span) would likely be *very*
dangerous to your life, not to mention that they might not work at all.
Giving the nanomachines "programs" for creating new neural tissue may be
your answer, but who knows enough to create a "new wiring system" for a
human brain? Just adding on tissue would be grossly inefficient and likely
stupid.

Most of the greatest works in history have been a collecting together
(usually by a single individual or a couple of people) of other work
already done. That's fine so far, but for a project of this magnitude?!?
I don't know if even the supposed "super-geniuses" of our time could
handle it (even given the information).

"Intelligence" as we know it is just one aspect of a "mind", as people have
pointed out in this discussion. I don't know if minds would turn out that
much different (at their base, at least) by *accident*, which is the method
that seems to be favored here for forced increases in intelligence. By
design, I say that when (and if ;-) we get it in our heads to change our
heads, we should carefully think out what we want. Greater learning
potential is one thing, but there are other possibilities besides a linear
increase in our "intelligence", as some of you have pointed out. The bases
for emotion all have functions, so we should think about what kind of
societal/personal role we want them to take.
All of this should also take into account the question of who would get
this kind of thing... everyone? A select few? Or what?

I have tentatively been working on ideas for life-extension projects for a
while now (that's part of the reason for the neuroscience and genetics),
and have realized that improving intelligence would be almost a must
(someone would likely get terribly bored otherwise ;-), and that another
bodily form might also be necessary, etc. I wish I had better answers to
these questions, but I'm glad others are thinking about them too. It looks
like these kinds of thoughts are becoming more common among motivated and
widely interested people (and, for that matter, there seem to be more
highly motivated and widely interested people around, but I think that's a
fallacy, since this medium just allows those of us who have the motivation
to communicate our common interests to each other).

>Here's a truly dismal thought: What happens to an organism which has no
>challenges to overcome?

You got me... but as a primitive answer (using little of the insight
above), look at the Olympian gods (if it's just more intelligence and
lifespan, that is). You could create challenges (of understanding), and
make that the societal norm, again assuming the same emotional bias.

    Erich

P.S.: I think I am guilty of the limited vision too, but I'm working at it!

     /  Erich Stefan Boleyn                     Internet E-mail:          \
>--={  Portland State University Honorary Graduate Student (Math)          }=--<
     \  College of Liberal Arts & Sciences       *Mad Genius wanna-be*    /
        "I haven't lost my mind; I know exactly where I left it."