Xref: utzoo comp.arch:11440 comp.lang.misc:3484
Path: utzoo!attcan!utgpu!jarvis.csri.toronto.edu!mailrus!ncar!unmvax!bbx!bbxsda!scott
From: scott@bbxsda.UUCP (Scott Amspoker)
Newsgroups: comp.arch,comp.lang.misc
Subject: Re: Fast conversions, another urban myth?
Keywords: BCD, radix-conversion, COBOL
Message-ID: <103@bbxsda.UUCP>
Date: 18 Sep 89 16:56:34 GMT
References: <832@dms.UUCP>
Reply-To: scott@.UUCP (Scott Amspoker)
Organization: /etc/organization
Lines: 25

In article <832@dms.UUCP> albaugh@dms.UUCP (Mike Albaugh) writes:
>
> Anyway, the "conventional wisdom" on comp.arch was that:
>
> BCD math is inherently slow
> Cobol doesn't do much math anyway
> ergo: "modern" processors don't need BCD math, just convert to
> binary and do the math there, then back to BCD for output.
>

This has come up a lot with us since we do business-oriented software.
I have come to discover that there are a few other considerations when
converting BCD to binary and back again:

1)  Rounding errors can and do creep in.

2)  Much business software requires rounding of results to a specified
number of decimal places.  This cannot be done easily in binary.  One
solution is to do it after you return to BCD, but you might end up
rounding up, and then you're back to messy BCD digit manipulation.

If decimal precision is required, one can always consider a binary
format that has an implicit decimal point on the right rather than
the left.  There are other problems associated with that too.
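
To make (1) and the implicit-decimal-point idea a little more concrete,
here is a quick C sketch.  This is an illustration only, not our actual
code; the function names and the round-to-2-places rule are just
assumptions for the example.  It totals 10000 line items of $0.10 two
ways - as a scaled integer holding cents, and as a binary double that
then gets rounded back to 2 places:

    #include <stdio.h>
    #include <math.h>

    /* Scaled-integer approach: amounts are kept in cents (implicit
     * decimal point on the right), so 10 means $0.10 and the running
     * total stays exact. */
    static long sum_cents(long amount_cents, int count)
    {
        long total = 0;
        int i;

        for (i = 0; i < count; i++)
            total += amount_cents;
        return total;           /* 100000 cents == $1000.00, exactly */
    }

    /* Binary floating-point approach: 0.10 has no exact binary
     * representation, so a little error accumulates with each add. */
    static double sum_double(double amount, int count)
    {
        double total = 0.0;
        int i;

        for (i = 0; i < count; i++)
            total += amount;
        return total;           /* close to, but not exactly, 1000.00 */
    }

    /* Round a binary value to 2 decimal places (round half up for
     * positive values).  Note the scaling here is itself done in
     * binary, which is where borderline misroundings can sneak in. */
    static double round2(double x)
    {
        return floor(x * 100.0 + 0.5) / 100.0;
    }

    int main(void)
    {
        printf("cents : %ld\n", sum_cents(10L, 10000));
        printf("double: %.15f\n", sum_double(0.10, 10000));
        printf("round2: %.2f\n", round2(sum_double(0.10, 10000)));
        return 0;
    }

The scaled-integer total never drifts, but once you work in cents (or
mils) every multiply and divide needs an explicit rescale, and overflow
arrives a lot sooner - which is the sort of thing I had in mind with
"other problems" above.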