Path: utzoo!utgpu!jarvis.csri.toronto.edu!neat.cs.toronto.edu!rayan
Newsgroups: comp.arch
From: rayan@cs.toronto.edu (Rayan Zachariassen)
Subject: fad computing
Message-ID: <89Nov25.051946est.2233@neat.cs.toronto.edu>
References: <1128@m3.mfci.UUCP> <1989Nov22.175128.24910@ico.isc.com> <3893@scolex.sco.COM> <39361@lll-winken.LLNL.GOV> <17305@netnews.upenn.edu> <1989Nov25.000120.18261@world.std.com>
Distribution: usa
Date: 25 Nov 89 10:20:23 GMT

[ This article mentions no chips, no brands, no benchmarks.  It is about
  fuzzy issues as viewed from a systems management perspective.  It is
  also 190 lines... ]

Like Barry says, hardware technology is NOT the issue at this stage.
The issues (the ones I can remember now :-) are:

Education.

Vendors have a tendency to ``forget'' to tell customers about the other
half of the cost of the machine they're selling: support.  The way it
should work is that you buy a slave (aka sysprog) along with any machine
you buy, but several things conspire against awareness of this:

- Many machines are sold to people who run one application, which could
  be the O.S. of the machine for all they care.  Salescritters therefore
  talk to a lot of people who really just want a turnkey black box, so
  they don't keep the support issue in mind when speaking to customers
  in other kinds of shops, and certainly never bring it up...  "But Mr.
  Jones, you realize that with 10000$ to spend you really should use a
  few thousand on proper system support, but unfortunately we don't have
  anything you could buy for less than 9000$.  For your own good,
  goodbye."  No way.

- PCs (even more "turnkey" machines) have given people the impression
  that they can run their own box, and that keeping a system running is
  something that can be relegated to (literally, at times) slave labour.

- Infrastructure is invisible.  People have to be reminded that it
  exists, that it is necessary, and to cooperate with it.  Think about
  your relationship with your Government and its Tax authority, then
  scale that down to your local shop.  Does the thought give you a warm
  fuzzy feeling that your money is being well spent on necessities?
  Probably not.

Psychology.

People don't like other people spending their money, particularly when
they feel they have no authority over the process, or when they,
personally, don't get special attention for their needs when they want
it.  If all you're paying for is infrastructure, it feels like paying
something for nothing.  They start viewing the infrastructure provider
as a "them" in an arms-length relationship and look for ways of
improving the service they get.  This typically means channelling $$
from large infrastructure to local equipment/labour, which undermines
the infrastructure support and worsens the situation for everyone who
depends on it.  Vicious circle.  Most of such money is of course spent
on equipment.

This was the precedent for the vax-class-minis to sun-class-workstations
movement of a few years ago; people were moving from 1mips shared
computing to 1.5mips on their desk, preciously guarded (ITS MINE!  ALL
MINE!  GET YOUR MITTS OFF MY CPU!!!).  The effect was to go from largish
empires that wouldn't or couldn't move quickly (largely due to lack of
mini-class products at the time, later also due to the size of capital
investment required), to small fiefdoms.  But people thought they were
happy, and it *is* their money after all.  So, they bought workstations.
They didn't need workstations.  (I'm making sweeping statements here to
make a point; I'm quite aware there are exceptions.)
And still, today, most people don't need a "high-performance micro" on
their desk; what they need is to get a job done in a productive
(computer) work environment.  Nowadays that reads "bitmapped screen,
window system, good response time".  If your job requires high-bandwidth
communication to where you physically sit to do your work, then maybe
having your own CPU is a reasonable way of fulfilling that requirement,
but for NO OTHER REASON.  Nevertheless, workstations were bought in
droves for all the wrong reasons: status symbol, use-it-or-lose-it
funding, and "wow! a computer on my desk; neato, now I can ignore
you!".  The end result is a LOT of compute cycles being spent in idle
loops.

Economy.

What happens if you try to fulfill the basic requirements using the
Display Station approach (the sole purpose of the computer on your desk
is to manage the interface to you and run compute-cheap tasks), instead
of the Work Station approach (your desktop computer does everything)?
The first given is that you save a lot of money by providing fancy
terminals instead of computers on people's desks.  To keep the
productivity requirement invariant, you pour this money into
centralized resources: CPU, I/O, printers, etc.

Try a rough calculation with today's prices: a display station is about
half the price of a workstation, say.  For a 5k$ workstation you get
2.5k$ to beef up the central facilities if you buy a terminal instead.
Multiply this by the number of workstations at a largish site, add that
amount onto the existing infrastructure funds, and you get a significant
increase in shared resources (a toy calculation appears at the end of
this section).  The advantages include:

- there are no private resources to waste.

- each user has the potential to use 100% of the shared resources when
  they need to.

The disadvantage that is always brought up as a counter-argument is
ROBUSTNESS.  Well, surprise surprise, a distributed environment is just
as fragile as a centralized one with the same functionality (that's the
consensus around here after years of observation), but it is MUCH more
complex.  Centralized environments can be made very robust if well
thought out.

In addition to the economy of resource scheduling, there are the usual
economies of scale inherent in an un-fragmented community.  One might
actually be able to afford decent I/O subsystems, which cost on the
order of the main CPU itself.  These matter all the more in this kind of
timesharing environment, because of the timesharing load AND because
there will always be people who would be I/O bound on the workstation
they could afford for their desk (e.g. when it starts thrashing due to
a large, theoretically compute-bound, job).  As long as the shared CPU
isn't idle waiting for I/O, it is effective and efficient computing as
far as system management is concerned.

If people have special needs, they buy the capability (a disk, another
cpu, whatever) and hand it to the shared facility to make it available
to themselves.  "If you want to use 600MB online, then either pay for
file storage or give us a disk and you'll get that space."  The
important part is that the disk sits with all the other disks, and
therefore receives better attention than it would on your desk -- A/C
machine room, ease of service, bits are bits so other file space can be
found if your disk dies and the data is important enough, competent
management and monitoring, etc.
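For concreteness, here is a toy C program that runs the numbers above.
The 5k$/2.5k$ prices are the rough figures I used; the 200 seats and
the 100k$ of existing infrastructure money are simply assumed for
illustration, not measurements from any real site.

    /*
     * Back-of-the-envelope display-station vs. workstation arithmetic.
     * Prices are the rough figures from the discussion above; the seat
     * count and existing shared budget are assumed for illustration.
     */
    #include <stdio.h>

    int main(void)
    {
        double workstation_price = 5000.0;   /* per-desk workstation, $   */
        double display_price     = 2500.0;   /* fancy terminal, half that */
        double existing_infra    = 100000.0; /* assumed shared budget, $  */
        int    seats             = 200;      /* assumed "largish site"    */

        double saved_per_seat = workstation_price - display_price;
        double freed          = saved_per_seat * seats;

        printf("freed for shared resources: %.0f$\n", freed);
        printf("shared budget: %.0f$ -> %.0f$ (a factor of %.1f)\n",
               existing_infra, existing_infra + freed,
               (existing_infra + freed) / existing_infra);
        return 0;
    }

With those assumed numbers the shared pool goes from 100k$ to 600k$.
The exact factor doesn't matter; the point is that the freed money is
large compared to what most shared facilities run on today.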
If I were in that kind of environment (and I am :-), I wouldn't care
which resources were being used as long as they are 1) paid for, and
2) easy to manage.  Often 2) translates into some variation of "how
close is the problem to where I am now?".  System staff hate running
around all over the place, both physically and online.

Quality of Service.

Done right, central facilities can give users a nice warm fuzzy good
feeling.  I think the trick is to give them on-site personal attention,
which is where centralization usually falls short.  There are solutions
to that; the one we use here is to tell people to hire their own
support people (aka slave labour in the University parlance).  They
then act as liaisons between the shared facility and the specific users
they interact with, and perhaps as application specialists for the
entire facility.  They do no infrastructure-type support.  The result
is no duplication of work, and good service.

But, as Barry says, "who cares" [which scheme people choose].
Certainly I wouldn't, except for collateral effects on the shared
facility and hence on the other users.  In this situation, people
really do get what they deserve.

Note that none of my comments said anything about central authority,
only central resources and resource management.  There is a large
psychological difference between the two concepts.

Immature Software.

To come full circle: this whole problem is caused by the nature of
today's computers, their software to be precise.  Each box is
self-contained and thinks it owns the world.  This is the wrong
premise, and I think we'll need a second UNIX-like (r)evolution to make
people realize it.  I'd like to see a situation where people buy

1) the interface hardware they need (capital cost)
2) a chunk of computing resources (capital cost)
3) system support (software/environment maintenance)

The hardware is tangible; it would sit on their desk and connect to the
communications infrastructure.  Most people need a fancy terminal, and
should not have to change their equipment unless the interface
technology changes and they NEED the new capabilities (e.g. bw->colour,
audio I/O, 3D digitizers, holographic displays).  Deprecated Sun3/50s
will work fine if you just want a screen, mouse, keyboard combination.
(Actually, the way OS's are going, running the interface is about all
they can do anyway :-).

The chunk of resources is a capital-cost contribution to a large shared
"system", to be allocated within that system according to global needs.
That system in turn is tiered appropriately, using older, slower
technology (which is pretty fast these days) to control non-compute
resources.  These days the depreciation period for hardware is 3 years,
which means there are a lot of cheap, capable computers out there, or
there will be 3 years from now ;-).  In a good system, they would be a
reusable resource.

In large environments, the workstation concept is a fad.

rayan

Artificial Intelligence/Numerical Analysis/Computational Theory
Shared Computing Facility
University of Toronto