Path: utzoo!utgpu!jarvis.csri.toronto.edu!rutgers!aramis.rutgers.edu!athos.rutgers.edu!nanotech
From: macleod@drivax.UUCP (MacLeod)
Newsgroups: sci.nanotech
Subject: Re: Intractability of active-shield testing
Message-ID:
Date: 22 Jun 89 03:02:07 GMT
Sender: nanotech@athos.rutgers.edu
Organization: Digital Research, Monterey, CA
Lines: 50
Approved: nanotech@aramis.rutgers.edu

In article dmo@turkey.philips.com (Dan Offutt) writes:
>These remarks apply to designs in general, and nanomachine designs in
>particular.  Nanomachines are likely to be more complex than
>present-day machines (holding size constant).  In general, the more
>complex the machine, the more difficult it will be to predict its
>interaction with the environment to which it must be fit.

I would sleep better if all engineers, of every discipline, read a
slender volume called "Systemantics" by a medical doctor named John
Gall.  It is a short, humorous series of essays exploring a number of
empirically derived axioms about system behavior.  Like its
predecessor, "The Peter Principle", it is actually profound truth
wrapped in humor.

Gall shows that as system complexity grows, the possibilities - and
likelihood - of anomalous behavior increase, presumably as some
function of the number of machine states.  The larger the system, the
more it tends to impede its own functioning.  The examples Gall cites
as climax designs often perversely generate exactly the problem they
were originally designed to surmount; the classic example is the
mammoth VAB at Cape Canaveral.  Built to protect Saturn V components
from the weather, it generates its own rain internally.

Michael Sloan MacLeod  (amdahl!drivax!macleod)

[This is often called the "law of unintended effect" and applies to
almost any complex system, not just engineered mechanisms.  Indeed,
it applies less to engineered machines than to most other complex
systems.  The VAB really does protect rockets from the strong winds
that are common on the Florida coastline.  However, purchasing
departments and their regulations typically cause organizations to
spend twice as much for what they buy.  Expanded legal liability for
manufacturers and doctors causes talented people to leave those
fields, and safety-oriented products and medicines to be withdrawn.
 The more complex something is, the greater the chance its design
and production will be done by committee and bureaucracy.  This is
the major reason for the more-than-linear decrease in reliability
and effectiveness with size.
 There is some reason to hope that for engineered machines, AI
systems will have their biggest impact simply by letting bigger
projects be handled by a single individual.  Furthermore, I'll wager
that the first corporation to replace its *management* with a
computer program will wipe up the competition in no time flat.  Of
course, as I have noted here before, there are some dangers inherent
in trying the same thing with the government...
--JoSH]