Path: utzoo!attcan!utgpu!jarvis.csri.toronto.edu!mailrus!purdue!tut.cis.ohio-state.edu!snorkelwacker!mit-eddie!rutgers!texbell!texsun!pollux!ti-csl!m2!oh
From: oh@m2.csc.ti.com (Stephen Oh)
Newsgroups: comp.dsp
Subject: Re: FFT vs ARMA (was FFTs of Low Frequency Signals (really: decimation))
Message-ID: <100052@ti-csl.csc.ti.com>
Date: 29 Nov 89 18:45:05 GMT
References: <1989Nov28.185555.4259@athena.mit.edu>
Sender: news@ti-csl.csc.ti.com
Reply-To: oh@m2.UUCP (Stephen Oh)
Organization: TI Computer Science Center, Dallas
Lines: 69

In article <1989Nov28.185555.4259@athena.mit.edu> ashok@atrp.mit.edu (Ashok C. Popat) writes:

>I'm not exactly sure what you mean by "too strong" --- it's a "given"
>in the problem. Are you saying that in many applications, waveforms
>cannot be usefully modeled as ergodic?

Yes.

>I guess then I could
>have described my hypothetical source as being WSS over 10^6 samples. A
>poor model for speech and images, but realistic in other applications.

Sure, you can do that. But I still wonder whether assuming WSS over 10^6
samples is reasonable, since that is a very long stretch of data. Don't you
think? By "partially WSS" I meant that some portions of the data are WSS,
but the data as a whole are not.

>Good point. I thought about this and here's what I came up with. The
>duration-bandwidth uncertainty principle says (for continuous-time
>waveforms) that
>
>    delta_t * delta_f >= 1/pi
>
>where delta_t is the time window size and delta_f is the frequency
>resolution (see William Siebert, _Circuits, Signals, and Systems_).
>I'm sure a similar result applies in the discrete-time case, but I
>don't have a reference off hand --- I'll assume it has the same form.

In the discrete-time case the resolution is 1/N cycles per sample, where N
is the number of samples.

>Now if you're starting with only 100 samples, the uncertainty
>principle says that there's simply not enough information in the data
>to get a high-resolution spectrum. If you do manage to get a
>high-resolution spectrum, the necessary added information must have
>come from the model, not the data. What do you think?

I don't know why you brought up the uncertainty principle, but is there any
measure showing that 100 samples are not enough to get a high-resolution
estimate? (This is not a flame; I just want to know.) Your statement is
true, though. From Kay's book (ISBN 0-13-598582-X):

    "Windowing of data or ACF values makes the implicit assumption that
    the unobserved data or ACF values outside the window are zero, which
    is normally an unrealistic assumption. A smeared spectral estimate is
    a consequence of the windowing. Often, we have more knowledge about
    the process from which the data samples are taken, or at least we are
    able to make a more reasonable assumption other than to assume the
    data or ACF values are zero outside the window."

And I agree with Kay. (A small numerical sketch of the 100-sample question
follows the reading list below.)

>Any recommended reading on these techniques?

1. H. Akaike, "A New Look at the Statistical Model Identification,"
   IEEE Trans. Automat. Contr., vol. AC-19, 1974.
2. E. J. Hannan, "The Estimation of the Order of an ARMA Process,"
   Ann. Statist., vol. 8, 1980.
3. R. L. Kashyap, "Optimal Choice of AR and MA Parts in Autoregressive
   Moving Average Models," IEEE Trans. Pattern Anal. Mach. Intell.,
   vol. PAMI-4, 1982.
4. L. Marple, "Digital Spectral Analysis with Applications," Prentice-Hall,
   1987.
5. S. Kay, "Modern Spectral Estimation," Prentice-Hall, 1988.
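To put a number on the 100-sample question, here is a rough sketch (Python
with NumPy). The two closely spaced test tones, the noise level, and the AR
order 30 are all values I made up for illustration; in practice the order
would be chosen with one of the criteria in the references above (AIC, etc.).
The point is only that two lines spaced closer than 1/N tend to merge in the
raw periodogram, while a model-based (Burg AR) estimate can separate them,
with the extra resolution coming from the model assumption, as you said.

    # spectres.py -- periodogram vs. Burg AR spectrum from N = 100 samples.
    # All numbers below (tone spacing, noise level, AR order 30) are
    # made-up illustration values, not anything from the articles above.
    import numpy as np

    def arburg(x, p):
        # AR parameter estimation by Burg's method (the Kay/Marple recursion).
        x = np.asarray(x, dtype=float)
        f = x.copy()               # forward prediction errors
        b = x.copy()               # backward prediction errors
        a = np.array([1.0])        # AR polynomial A(z), a[0] = 1
        E = np.dot(x, x) / len(x)  # prediction error power
        for j in range(p):
            fp, bp = f[j + 1:].copy(), b[j:-1].copy()
            k = -2.0 * np.dot(fp, bp) / (np.dot(fp, fp) + np.dot(bp, bp))
            f[j + 1:] = fp + k * bp    # updated forward errors
            b[j + 1:] = bp + k * fp    # updated (shifted) backward errors
            a = np.concatenate([a, [0.0]])
            a = a + k * a[::-1]        # Levinson-style coefficient update
            E *= 1.0 - k * k
        return a, E

    rng = np.random.default_rng(0)
    N = 100
    n = np.arange(N)
    # Two tones 0.005 cycles/sample apart -- finer than the Fourier limit
    # of 1/N = 0.01 -- plus a little white noise.
    x = (np.cos(2 * np.pi * 0.200 * n)
         + np.cos(2 * np.pi * 0.205 * n + 1.0)
         + 0.02 * rng.standard_normal(N))

    nfft = 4096
    freqs = np.arange(nfft // 2) / nfft
    pergram = np.abs(np.fft.fft(x, nfft))[:nfft // 2] ** 2 / N  # raw periodogram
    a, E = arburg(x, 30)                                        # order 30 picked by hand
    P_ar = E / np.abs(np.fft.fft(a, nfft))[:nfft // 2] ** 2     # AR spectrum E/|A(f)|^2

    def peaks_in(P, lo=0.195, hi=0.210):
        # Count local maxima of P(f) for lo < f < hi.
        idx = np.where((freqs > lo) & (freqs < hi))[0]
        return sum(1 for i in idx[1:-1] if P[i] > P[i - 1] and P[i] > P[i + 1])

    print("nominal Fourier resolution 1/N   :", 1.0 / N)
    print("periodogram peaks near 0.20-0.205:", peaks_in(pergram))
    print("Burg AR(30) peaks near 0.20-0.205:", peaks_in(P_ar))

The intent is that the periodogram shows one merged lump while the AR
estimate shows two peaks; raise the noise or drop the order and the AR peaks
merge too, which is the model-versus-data trade-off in a nutshell.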
+----+----+----+----+----+----+----+----+----+----+----+----+----+
| Stephen Oh        oh@csc.ti.com     | Texas Instruments        |
| Speech and Image Understanding Lab. | Computer Science Center  |
+----+----+----+----+----+----+----+----+----+----+----+----+----+