Path: utzoo!attcan!uunet!decwrl!asylum!osc!jgk
From: jgk@osc.COM (Joe Keane)
Newsgroups: comp.arch
Subject: Re: Resolution, etc.
Summary: Dithering is neat.
Message-ID: <4042@osc.COM>
Date: 29 Nov 90 01:44:19 GMT
References: <240@csinc.UUCP> <1990Nov15.052925.1265@imax.com> <2928@crdos1.crd.ge.COM> <1990Nov19.195042.19240@imax.com> <4027@osc.COM> <1990Nov23.182147.26688@zoo.toronto.edu> <1990Nov23.204238.12597@elroy.jpl.nasa.gov>
Reply-To: jgk@osc.COM (Joe Keane)
Organization: Versant Object Technology, Menlo Park, CA
Lines: 65

In article <4027@osc.COM> jgk@osc.COM (Joe Keane) writes:
>Quantization of display colors is a small problem, since you can use dithering
>to get what you want...

In article <1990Nov23.182147.26688@zoo.toronto.edu> henry@zoo.toronto.edu
(Henry Spencer) writes:
>How well does dithering work for animation?  (Serious question, I'm not up
>on the fine points of this.)  Many display artifacts that don't look too
>serious in a still image become glaringly obvious when they change from
>frame to frame.

It gets more complicated, but i figure if you're doing animation you're used
to that.  Dithering each frame individually doesn't work very well, since you
get all sorts of annoying moving patterns.  Basically you want to dither in
three dimensions, although the temporal dimension isn't weighted the same as
the two spatial ones.  As you can imagine, the error diffusion algorithms get
pretty complex.  But if it's done right, the errors in a given frame are
compensated by those in the surrounding frames.

>"I'm not sure it's possible   | Henry Spencer at U of Toronto Zoology
>to explain how X works."      | henry@zoo.toronto.edu   utzoo!henry

Heh heh, i don't know about that, but i'm not sure it's possible to explain
_why_ X works the way it does.

>I'd also note that for some classes of images, like scientific visualization,
>it is not acceptable to mess with the pixels to make it look better.

In article <1990Nov23.204238.12597@elroy.jpl.nasa.gov>
alan@cogswell.Jpl.Nasa.Gov (Alan S.
Mazer) writes:
>Absolutely, which is why we don't use dithering around here.  It's bad enough
>having to do the quantization, but if we dithered, individual pixels would be
>almost useless.  Dithering is fine if you want to make pretty pictures, but
>if you are actually using the picture for something analytical you lose a lot
>of information.

Maybe i'm missing something, but i don't see why visualization is that much
different from pretty pictures.  Specifically, i'd say something is seriously
wrong if a single pixel is really that critical.  That's what zooming in is
for, right?  Personally i can't spend all day looking at individual pixels,
and it's only going to get worse as displays get higher resolution.

Here is an example: i had a simple program to view grey-scale pictures on a
1-bit machine.  This isn't quite the same thing we're talking about, since
dithering black vs. white is a lot more obvious than dithering small
variations in color.  Anyway, initially it mapped one pixel in the picture to
one pixel on the screen.  Of course you lose individual pixels unless they
have high contrast.  But then i added zoom-in features, with some fairly
simple interpolation.  The results look quite good; you can see every last
detail of the original picture at about 8x expansion.  It was a bit on the
slow side, but that's what i get for using a lowly Sun-3.

Let me propose a simple test.  Suppose we have a choice between, say, a
1000x1000 pixel display with 48 bits of true color, and a 2000x2000 pixel
display with a 4096-color palette.  They take the same number of bits: a
4096-color palette is 12 bits per pixel, and 2000x2000x12 = 1000x1000x48 =
48 million bits.  But which is better?  I'd take the high-resolution one in
a second, assuming good software like i said before.  You can do the large
expanses of slowly changing color almost as well as on the low-resolution
one, but the sharp edges have four times the resolution.  My main point is
that you can always give up resolution for more accurate colors, but there's
no way to get better resolution than what you have.
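In case it's not obvious what "fairly simple interpolation" means here, the
following is a rough sketch of the idea, written in Python for compactness.
The function name and the image format (a list of rows of grey levels 0..255)
are my own for illustration; they're not from the actual viewer program.

```python
def zoom_bilinear(img, factor):
    """Enlarge a grey-scale image by an integer factor using
    bilinear interpolation between the four nearest source pixels."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(h * factor):
        # Map the output pixel back to a fractional source coordinate.
        sy = min(y / factor, h - 1)
        y0 = int(sy)
        y1 = min(y0 + 1, h - 1)
        fy = sy - y0
        row = []
        for x in range(w * factor):
            sx = min(x / factor, w - 1)
            x0 = int(sx)
            x1 = min(x0 + 1, w - 1)
            fx = sx - x0
            # Blend horizontally along the top and bottom source rows,
            # then vertically between the two results.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(round(top * (1 - fy) + bot * fy))
        out.append(row)
    return out
```

On a 1-bit screen you'd then dither the enlarged result; at 8x expansion
each source pixel covers 64 screen pixels, which is plenty to render its
grey level faithfully.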
Don't get me wrong: dithering is not a trivial task, and sometimes i think
people get it wrong more often than right.  Taking a display from pretty
good to photograph quality takes good software and often a good amount of
CPU.  But it can be done.
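For reference, the classic single-frame form of the error diffusion i keep
mentioning is Floyd-Steinberg.  Here's a minimal sketch in Python, dithering
a grey-scale image (levels 0..255) down to 1 bit per pixel; the function name
and image format are mine, for illustration only.  The three-dimensional
version discussed above would additionally push part of each pixel's error
into the corresponding pixels of the next frame.

```python
def dither_1bit(img):
    """Floyd-Steinberg error diffusion: quantize each pixel to black
    or white and spread the quantization error to the not-yet-visited
    neighbors, so local black/white density tracks the grey level."""
    h, w = len(img), len(img[0])
    # Work on a float copy so the diffused errors can accumulate.
    buf = [[float(p) for p in row] for row in img]
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            old = buf[y][x]
            new = 255 if old >= 128 else 0
            out[y][x] = 1 if new else 0
            err = old - new
            # Standard Floyd-Steinberg weights: 7/16 right, then
            # 3/16, 5/16, 1/16 across the row below.
            if x + 1 < w:
                buf[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    buf[y + 1][x - 1] += err * 3 / 16
                buf[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    buf[y + 1][x + 1] += err * 1 / 16
    return out
```

Even this toy version shows why a single dithered pixel means little on its
own: the information lives in the local average, not in any one dot.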