Newsgroups: comp.archives
Path: utzoo!utgpu!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!caen!ox.com!msen.com!emv
From: moeller@kiti.informatik.uni-bonn.de
Subject: [fj.mail-lists.connectionist] [Knut Moeller: TR available from neuroprose; learning algorithms]
Message-ID: <1991Jun16.174707.25642@ox.com>
Followup-To: fj.mail-lists.connectionist
Sender: emv@msen.com (Edward Vielmetti, MSEN)
Reply-To: moeller@kiti.informatik.uni-bonn.de
Organization: (none)
X-Original-Date: 14 Jun 91 10:13:06 GMT
Date: Sun, 16 Jun 1991 17:47:07 GMT
Approved: emv@msen.com (Edward Vielmetti, MSEN)
X-Original-Newsgroups: fj.mail-lists.connectionist
Lines: 68
Archive-name: ai/neural-nets/fox-decomp/1991-06-14
Archive: cheops.cis.ohio-state.edu:/pub/neuroprose/fox.decomp.* [128.146.8.62]
Original-posting-by: moeller@kiti.informatik.uni-bonn.de
Original-subject: [Knut Moeller: TR available from neuroprose; learning algorithms]
Reposted-by: emv@msen.com (Edward Vielmetti, MSEN)

------- Forwarded Message

Date: Thu, 13 Jun 91 09:50:34 +0200
From: Knut Moeller
Message-Id: <9106130750.AA01054@kiti.>
Subject: TR available from neuroprose; learning algorithms

The following report is now available from the neuroprose archive:

        LEARNING BY ERROR-DRIVEN DECOMPOSITION

    D. Fox   V. Heinze   K. Moeller   S. Thrun   G. Veenker
                        (6 pp.)

Abstract:
In this paper we describe a new self-organizing decomposition technique
for learning high-dimensional mappings. Problem decomposition is performed
in an error-driven manner, such that the resulting subtasks (patches) are
equally well approximated. Our method combines an unsupervised learning
scheme (Feature Maps [Koh84]) with a nonlinear approximator
(Backpropagation [RHW86]). The resulting learning system is more stable
and effective in changing environments than plain backpropagation, and
much more powerful than the extended feature maps proposed by [RMW89].
Extensions of our method give rise to active exploration strategies for
autonomous agents facing unknown environments. The appropriateness of this
technique is demonstrated with an example from mathematical function
approximation.

-----------------------------------------------------------------------------

To obtain copies of the postscript file, please use Jordan Pollack's
service. Example:

unix> ftp cheops.cis.ohio-state.edu        # (or: ftp 128.146.8.62)
Name (cheops.cis.ohio-state.edu:): anonymous
Password (cheops.cis.ohio-state.edu:anonymous):
ftp> cd pub/neuroprose
ftp> binary
ftp> get fox.decomp.ps.Z
ftp> quit
unix> uncompress fox.decomp.ps.Z
unix> lpr -P<your_local_postscript_printer> fox.decomp.ps

-----------------------------------------------------------------------------

If you have any difficulties with the above, please send e-mail to
moeller@kiti.informatik.uni-bonn.de
DO NOT "reply" to this message!!

------- End of Forwarded Message

--
comp.archives file verification

cheops.cis.ohio-state.edu
-rw-r--r--  1 3169  274  745782 Jun 10 10:10 /pub/neuroprose/fox.decomp.ps.Z
found fox-decomp ok
cheops.cis.ohio-state.edu:/pub/neuroprose/fox.decomp.*
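[Editorial note: the error-driven decomposition idea from the abstract can
be sketched in a few lines of modern Python/NumPy. This is an illustration
of the general technique, not the authors' implementation: Voronoi patches
around prototypes stand in for the feature map, a per-patch linear fit
stands in for the backpropagation network, and the patch with the largest
error is repeatedly split, driving the patches toward equally good
approximations. All names and parameters here are invented for the sketch.]

```python
# Sketch of error-driven decomposition for 1-D function approximation.
# Patches = Voronoi cells around prototypes; local model = linear fit.
# The worst-approximated patch is split, so errors equalize over patches.
import numpy as np

X = np.linspace(0.0, 2.0 * np.pi, 400)
Y = np.sin(X)                            # toy target mapping

def fit_patches(prototypes):
    """Assign samples to the nearest prototype, fit one line per patch,
    and return the assignment plus per-patch squared errors."""
    assign = np.argmin(np.abs(X[:, None] - prototypes[None, :]), axis=1)
    errors = np.zeros(len(prototypes))
    for p in range(len(prototypes)):
        mask = assign == p
        if mask.sum() < 2:
            continue                     # too few samples to fit a line
        coef = np.polyfit(X[mask], Y[mask], 1)        # local model
        resid = Y[mask] - np.polyval(coef, X[mask])
        errors[p] = np.sum(resid ** 2)
    return assign, errors

prototypes = np.array([np.pi])           # start with a single patch
assign, errors = fit_patches(prototypes)
initial_error = errors.sum()

for _ in range(6):                       # error-driven splitting loop
    worst = int(np.argmax(errors))       # patch with the largest error
    members = X[assign == worst]
    # split the worst patch: place a new prototype beside the old one
    new_proto = members.mean() + 0.25 * members.std()
    prototypes = np.sort(np.append(prototypes, new_proto))
    assign, errors = fit_patches(prototypes)

final_error = errors.sum()
```

After six splits the total approximation error drops well below that of
the single initial patch, with the splits concentrated where the linear
models fit worst.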