Path: utzoo!attcan!utgpu!jarvis.csri.toronto.edu!cs.utexas.edu!usc!ucsd!ucsdhub!hp-sdd!ncr-sd!ncrcae!hubcap!gene
From: gene@cs.bu.edu (Gene Itkis)
Newsgroups: comp.parallel
Subject: Re: scalability of n-cubes, meshes (was: IPSC Communications)
Message-ID: <7274@hubcap.clemson.edu>
Date: 1 Dec 89 13:54:58 GMT
Sender: fpst@hubcap.clemson.edu
Lines: 33
Approved: parallel@hubcap.clemson.edu

In article <7178@hubcap.clemson.edu> wilson@carcoar.Stanford.EDU (Paul Wilson) writes:
>My admittedly naive intuitions would say that only meshes are truly
>scalable, since you have to pack things into real (<= 3D) space.
>...
>It would *seem* to me that a 3D mesh is the only way to go
>because that's the highest dimensionality you can embed into
>a 3D reality.  You get constant time per hop, no problem.

One problem with a 3D mesh is that the processors generate heat as they
work(*).  Obviously this heat has to be dissipated somehow.  The heat
produced is proportional to the volume of the mesh, while dissipation can
only happen in proportion to the surface area.  So if you care about
scalability, 3D meshes are no good - they will overheat and cook
themselves.  Thus the best you can hope for, given our physics, is a 2D
mesh.

>...
>Hypercubes end up needing long wires to project a higher-dimensional
>graph into 2- or 3-space.  As processor speeds increase (and the
>speed of light presumably doesn't) these end up being slower
>than other links and destroy the scalability of n-cubes.

Somewhat related issues are considered in a FOCS-89 paper, "Power of Fast
VLSI Models is Insensitive to Wires' Thinness".  One consequence of the
result: if the communication time over a wire of length x is, say,
$x \log^{1.1} x$, then designers can design their chips using "imaginary"
wires of width 0, and the chips so designed can be simulated on-line by a
mesh of finite processors without any increase in space or time.

--------------------
(*) Some people doubt that energy dissipation is necessary in reversible
computations.  But non-dissipating gates have not been discovered yet.
In addition, some computations, e.g. error correction, must be
irreversible and therefore are sure to produce heat.  So the assumption
that computing elements must generate heat seems safe, at least for a
while.
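
To make the volume-vs-surface argument concrete, here is a back-of-the-
envelope sketch, assuming an idealized $n \times n \times n$ mesh in which
each processor dissipates some fixed amount of heat $h$ (both $n$ and $h$
are illustrative symbols, not from the original post):

    heat generated $\sim h n^3$,   cooling surface $\sim n^2$,
    heat per unit of surface $\sim h n^3 / n^2 = h n$,

which grows without bound as the machine is scaled up.  For a 2D mesh of
$n \times n$ processors the same accounting gives heat $\sim h n^2$
against exposed face area $\sim n^2$, i.e. a constant flux per unit of
surface - which is why 2D is claimed above to be the limit for a
surface-cooled machine.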