Path: utzoo!utgpu!jarvis.csri.toronto.edu!mailrus!ncar!boulder!agcsun!marks
From: marks@agcsun.UUCP (Mark Shepherd)
Newsgroups: comp.software-eng
Subject: Re: Predicting Software Ship Dates
Summary: it depends...
Message-ID: <601@agcsun.UUCP>
Date: 30 Nov 89 23:43:47 GMT
References: <15030@joshua.athertn.Atherton.COM>
Organization: Ampex VSD Golden Engineering, Golden, CO
Lines: 39

In article <15030@joshua.athertn.Atherton.COM>, joshua@athertn.Atherton.COM (Flame Bait) writes:
> I have a simple question, I'm looking for a simple answer :-)
>
> Given any information you want, predict when a software project will have
> fewer than N bugs, and make the prediction after "new" coding has stopped,
> but while debugging and testing is still going on.

I don't believe that there is a reliable, quantitative way of producing
the answer you're looking for. I do believe that the answer depends (at
least in part) on the following:

How well do you understand the requirements?
How good is the design?
How good is the designer?
How good are the tools?
How much time was spent in requirements analysis and design?
Have you ever done this sort of thing before?
Are there mutually dependent subsystems (e.g. hardware and software) that
must be debugged simultaneously?
Was the project done by a small, coherent team?
Was the team all in one location?
Does your management place higher priority on schedule or on quality?
Did you prototype the system, or at least the tricky or poorly understood
parts?
Are you willing to redesign/rewrite the pieces that turn out to be poorly
designed or implemented?
Is there a comprehensive test plan?

I'm afraid that these questions are much more difficult to answer than
the usual QA ones (how much paper was produced? was MIL-STD abc123
followed?). Sorry.

Mark Shepherd
Ampex Corporation
...!ucbvax!avsd!dse!agcsun!marks
These are my opinions.
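
P.S. For what it's worth, the kind of quantitative answer the original
question asks for usually amounts to a software reliability growth model:
fit a decay curve to defect-discovery data and extrapolate to the point
where the expected number of undiscovered bugs drops below N. Here is a
minimal sketch of that idea (the weekly counts and the exponential-decay
assumption are purely hypothetical) -- and, as argued above, I would not
trust its output as a ship date:

```python
# Naive defect-decay extrapolation: assume the weekly bug-discovery rate
# decays exponentially, fit it by least squares on the log of the counts,
# and solve for the week at which the integrated tail (expected bugs still
# undiscovered) falls below N. A sketch only; the data are hypothetical.
import math

def predict_ship_week(weekly_bug_counts, n_remaining):
    """Fit log(count) = log(r0) - k*week, then solve (r0/k)*exp(-k*T) < N.
    Counts must be positive and roughly decreasing for the fit to make sense."""
    xs = range(len(weekly_bug_counts))
    ys = [math.log(c) for c in weekly_bug_counts]
    n = len(weekly_bug_counts)
    mx = sum(xs) / n
    my = sum(ys) / n
    # slope of the least-squares line is -k (decay rate per week)
    k = -sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    r0 = math.exp(my + k * mx)  # fitted initial discovery rate (bugs/week)
    # expected bugs still undiscovered after week T: (r0/k)*exp(-k*T)
    return math.log(r0 / (k * n_remaining)) / k

# Hypothetical counts falling roughly by half each week:
print(round(predict_ship_week([40, 21, 11, 5, 3], 1), 1))  # -> 6.2 (weeks)
```

Of course, this assumes bugs are found at a smoothly decaying rate -- which
is exactly the assumption that the questions above tend to invalidate.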