Path: utzoo!attcan!uunet!mcsun!ukc!warwick!esrmm
From: esrmm@warwick.ac.uk (Denis Anthony)
Newsgroups: comp.ai.neural-nets
Subject: Dynamic Parameter Tuning
Keywords: Learning rate, momentum
Message-ID: <1991Feb6.113616.19944@warwick.ac.uk>
Date: 6 Feb 91 11:36:16 GMT
Sender: news@warwick.ac.uk (Network news)
Organization: Computing Services, Warwick University, UK
Lines: 37

I have tried the Vogl et al. method of tuning the learning rate and
momentum to speed convergence:

%Q Vogl T.P., Mangis J.K., Rigler A.K., Zink W.T., and Alkon D.L.
%D 1988
%T Accelerating the Convergence of the Back-Propagation Method
%J Biological Cybernetics
%V 59
%P 257-263

The paper suggests that this method increases convergence speed, and
that epoch learning is better than pattern learning. Results are shown
for the former, but the latter claim is not tested in that study.

I have found that the method does not work on my data, which consists
of a variety of images used both for learning image classification and
for inverse-problem solution in tomographic imaging. Specifically, I
find that with a constant learning rate, pattern learning is quicker
and gives similar error, and that the dynamic learning rate is not an
improvement. On at least one problem, which I have repeated several
times, the dynamic method converges faster, but to a much higher error
than the static learning rate (using the same initial learning rate in
both cases). In these particular runs the error for the dynamic method
never drops below that of the static case at any given epoch.

All this seems counter-intuitive to me, as the method seems sensible
(raise the learning rate when the error is dropping, reduce it when the
error is rising). I am particularly surprised because the learning rate
does progressively go up. Apart from a coding error (always possible),
are there any suggestions as to whether this is predictable and
reasonable behaviour, and if so, why? (I am using the PDP back-prop
software (McClelland and Rumelhart) with a few lines inserted to make
the learning rate adapt.)

Denis
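
P.S. In case anyone wants to check my reading of the method, here is a
minimal sketch (in C, since the PDP code is in C) of the epoch-wise
adaptation rule as I understand it. The constants PHI, BETA and TOL are
typical values quoted for this scheme, not necessarily the paper's, and
restore_weights() is a hypothetical hook standing in for whatever
undoes the epoch's weight change; none of this is taken verbatim from
my patch to the PDP code.

/* Epoch-wise learning-rate adaptation after Vogl et al. (1988),
 * as I understand it.  PHI, BETA, TOL and restore_weights() are
 * illustrative choices, not taken from the paper or the PDP code. */

#include <stdio.h>

#define PHI  1.05   /* multiply rate by this when error falls      */
#define BETA 0.70   /* multiply rate by this when error rises      */
#define TOL  1.04   /* tolerate up to a 4% rise before backing off */

static double lrate      = 0.5;    /* learning rate (eta)          */
static double momentum   = 0.9;    /* momentum term (alpha)        */
static double prev_error = 1e30;   /* error at last accepted epoch */

/* Hypothetical hook: undo the last epoch's weight change. */
static void restore_weights(void) { /* ... */ }

/* Call once per epoch with the total sum-squared error. */
static void adapt(double error)
{
    if (error < prev_error) {
        lrate *= PHI;          /* error fell: speed up             */
        momentum = 0.9;        /* restore momentum                 */
        prev_error = error;    /* accept the epoch                 */
    } else if (error > TOL * prev_error) {
        lrate *= BETA;         /* error jumped: slow down,         */
        momentum = 0.0;        /* kill momentum, and               */
        restore_weights();     /* discard the weight change        */
    } else {
        prev_error = error;    /* small rise: accept it, but       */
    }                          /* leave the rate alone             */
}

int main(void)                 /* toy driver to show the rule      */
{
    double errs[] = { 10.0, 8.0, 9.0, 8.3, 20.0, 7.0 };
    int i;
    for (i = 0; i < 6; i++) {
        adapt(errs[i]);
        printf("epoch %d  error %.1f  eta %.3f  alpha %.1f\n",
               i, errs[i], lrate, momentum);
    }
    return 0;
}

If my reading above is right, the rate is only cut when the error rises
by more than a few percent in a single epoch, which may explain why I
see it climbing progressively on my data.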