Path: utzoo!utgpu!news-server.csri.toronto.edu!cs.utexas.edu!swrinde!zaphod.mps.ohio-state.edu!cis.ohio-state.edu!ucbvax!agate!stanford.edu!unix!ctnews!pyramid!athertn!hemlock!mcgregor
From: mcgregor@hemlock.Atherton.COM (Scott McGregor)
Newsgroups: comp.human-factors
Subject: Re: adaptive user interfaces
Message-ID: <35509@athertn.Atherton.COM>
Date: 17 Jun 91 22:04:08 GMT
References: <1991Jun16.213531.8517@watdragon.waterloo.edu> <1991Jun12.182221.10179@cs.sfu.ca>
Sender: news@athertn.Atherton.COM
Reply-To: mcgregor@hemlock.Atherton.COM (Scott McGregor)
Organization: Atherton Technology -- Sunnyvale, CA
Lines: 26

In article <1991Jun16.213531.8517@watdragon.waterloo.edu>,
>sasingh@rose.waterloo.edu (Sanjay Singh) writes:
>>I think user interfaces could be made more adaptive if some artificial
>>intelligence techniques were used to make using them more intuitive.
>I am still new to AI, but I think natural language understanding and
>neural nets in VLSI could provide promise for making computers do what we
>mean rather than what we say.
>I believe Xerox PARC has been working on some 3-d type of GUI. It was on
>the cover of Byte some time back.

You might be interested in the articles about software that anticipates
what you want to do and adapts based upon past behavior. I wrote such an
article on Prescient Agents for the HP Professional (May 1990), and there
was an interesting article on EAGER in the most recent SIGCHI proceedings.
Neural nets and natural language understanding aren't necessary to provide
an improved intuitive interface; merely paying attention to what people
are doing and predicting what they will do can offer a more adaptive,
intuitive, "Radar O'Reilly" anticipatory type of interface. No doubt some
of these interfaces will be more attractive than others.

--Scott McGregor
  Atherton Technology
  mcgregor@atherton.com
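
[A minimal sketch of the kind of anticipation the post describes: an agent
that watches the stream of user actions and, given the most recent action,
suggests the action that most often followed it in the past. All the names
here are illustrative assumptions for this sketch; this is not the actual
mechanism of EAGER or of the Prescient Agents article.]

```python
from collections import Counter, defaultdict

class AnticipatoryAgent:
    """Sketch of a 'Radar O'Reilly' style predictor: it records which
    action tends to follow which, and suggests the most frequent
    follow-up to whatever the user just did."""

    def __init__(self):
        # transitions[a] counts the actions observed immediately after a
        self.transitions = defaultdict(Counter)
        self.last_action = None

    def observe(self, action):
        # Record that `action` followed the previous action, if any.
        if self.last_action is not None:
            self.transitions[self.last_action][action] += 1
        self.last_action = action

    def predict(self):
        # Suggest the historically most frequent follow-up to the
        # current action, or None if there is no history for it yet.
        followers = self.transitions.get(self.last_action)
        if not followers:
            return None
        return followers.most_common(1)[0][0]

# Usage: after repeatedly seeing "open" followed by "edit", the agent
# anticipates "edit" the next time the user opens a file.
agent = AnticipatoryAgent()
for a in ["open", "edit", "save", "open", "edit", "save", "open"]:
    agent.observe(a)
print(agent.predict())  # -> edit
```

Even something this simple captures the point of the post: no neural nets
or language understanding, just attention to past behavior.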