Path: utzoo!attcan!utgpu!jarvis.csri.toronto.edu!mailrus!tut.cis.ohio-state.edu!cs.utexas.edu!csd4.milw.wisc.edu!leah!rpi!batcomputer!cornell!rochester!pt.cs.cmu.edu!sei!sei.cmu.edu!mcp
From: mcp@sei.cmu.edu (Mark Paulk)
Newsgroups: comp.software-eng
Subject: Re: code reviews
Summary: references and data on reviews/inspections
Keywords: peer review, review, inspection, walkthrough
Message-ID: <9474@aw.sei.cmu.edu>
Date: 20 Jun 89 16:42:41 GMT
References: <116@opel.UUCP> <5801@hubcap.clemson.edu>
Sender: netnews@sei.cmu.edu
Reply-To: mcp@bs.sei.cmu.edu.UUCP (PUT YOUR NAME HERE)
Organization: Carnegie-Mellon University, SEI, Pgh, Pa
Lines: 94

In article <5801@hubcap.clemson.edu> ofut@hubcap.clemson.edu (A. Jeff Offutt) writes:
>From article <116@opel.UUCP>, by johnk@opel.UUCP (John Kennedy):
>> In article <12047@bloom-beacon.MIT.EDU> tada@athena.mit.edu (Michael Zehr) writes:
>> [...]
>>>
>>>Since part of my job here is to improve programmer productivity, i'm
>>>...
>>>periodic peer reviews.
>>>
>>>michael j zehr
>
>
>> I can't help but chuckle when I see code reviews and productivity in the
>> same paragraph.  It's good for the project, it's what the customer wants,
>
>Now be careful that when you say "increase in overhead and a decrease in
>productivity" we are all counting the same apples.  It is true that
>code reviews will take time and energy away from other tasks, but it can
>also save time and energy later on.  Reviewing code can significantly reduce

The following references have some interesting data and insights on the role
and power of inspections.  It is unfortunately difficult to get hard data on
the effectiveness of methods such as inspections, but the published work
below is much more pertinent than my opinion would be...  Pertinent quotes
follow each reference.

Let me also emphasize correct definition of terms!  Inspections, reviews,
walkthroughs, etc., are frequently overloaded terms.  For the "official"
definitions, see IEEE Standard 1028 (or Fagan's original paper, for that
matter).  There are distinct differences between the concepts; the most
formal is the inspection.  I use "peer review" as the all-inclusive term
because it is not defined in 1028, but some people use it in the hierarchy

    desk check
    peer review
    walkthrough
    inspection

as meaning a 1-1 review with A peer (as opposed to a group of peers).
"Review," by itself, covers a wide range of sins.  Much of this discussion
has been oriented toward technical reviews, i.e., CDR- and PDR-class
reviews, which are NOT what the original poster was asking about.

[ACKE89]  A.F. Ackerman, L.S. Buchwald, and F.H. Lewski, "Software
Inspections: An Effective Verification Process," IEEE Software, Vol. 6,
No. 3, May 1989, pp. 31-36.

    Software inspections have been found to be superior to reviews and
    walkthroughs, p. 31

    collection and analysis of data for immediate and long-term process
    improvement, p. 32

    inspections improve quality and productivity, p. 34

    inspections give project management more definite and more dependable
    milestones than less formal review processes, p. 35

[GILB88]  T. Gilb, PRINCIPLES OF SOFTWARE ENGINEERING MANAGEMENT,
Addison-Wesley, Reading, MA, 1988.

    The inspection method is the most effective quality control method for
    software specification documentation we know about. p. 68

    Testing is a maximum of 50 to 55% effective at defect identification
    and removal for a single test process. p. 221

    Inspection is about 80% (+/- 20%) effective in removing existing
    defects. p. 221

    The average is five hours saved for every hour invested in inspection.
    p. 221

    Inspected programs were ten times cheaper to maintain than ... similar
    non-inspected programs. p. 221

    High level inspections (requirements and design specification) were
    the most powerful things they could do. p. 244
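To make Gilb's numbers concrete, here is a rough back-of-the-envelope sketch
in C of what the 5:1 savings ratio and roughly 80% removal effectiveness
imply.  The inspection-hours and defect-count inputs are hypothetical,
chosen only for illustration; they are not taken from any of the references.

    /* Rough payoff estimate from the GILB88 figures quoted above:
     * ~5 hours saved downstream per hour invested in inspection, and
     * roughly 80% (+/- 20%) of existing defects removed.  The effort
     * and defect inputs below are made up, for illustration only.
     */
    #include <stdio.h>

    int main(void)
    {
        double hours_inspecting = 40.0;   /* hypothetical: one staff-week */
        double savings_ratio    = 5.0;    /* GILB88, p. 221               */
        double removal_rate     = 0.80;   /* GILB88, p. 221               */
        double defects_present  = 100.0;  /* hypothetical defect count    */

        printf("Hours saved downstream:  %.0f\n",
               hours_inspecting * savings_ratio);
        printf("Defects likely caught:   %.0f of %.0f\n",
               defects_present * removal_rate, defects_present);
        return 0;
    }

With those made-up inputs the sketch prints 200 hours saved and 80 of 100
defects caught, which is the arithmetic behind the "productivity" claim
being debated above.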
[GRAD87]  R.B. Grady and D.L. Caswell, SOFTWARE METRICS: ESTABLISHING A
COMPANY-WIDE PROGRAM, Prentice-Hall, Englewood Cliffs, NJ, 1987.

    The five "modern programming practices" which had the strongest
    correlation with productivity were top-down design, modular design,
    design reviews, code inspections, and quality assurance programs. p. 20

    Projects of highest productivity are among those with the lowest
    defect densities. p. 140

[MYER88]  W. Myers, "Shuttle code achieves very low error rate," IEEE
Software, Vol. 5, No. 5, September 1988, pp. 93-95.

    500 KSLOC of Space Shuttle source code with 0.11 errors/KSLOC, credited
    by IBM/FSD to process definition, rigorous inspection of work products
    across the process, independent software verification, defect-cause
    analysis, knowledge engineering, expert systems, specialized tools, and
    "good old value gained from lessons learned."  (Barbara Kolkhorst)
    Reduced from 2 errors/KSLOC, versus 8-10 errors/KSLOC for industry.
    Effort/time spent on software reconfiguration reduced by 50%.  About
    85% of major errors were discovered in inspection.

-- 
Mark C. Paulk						mcp@sei.cmu.edu

"The essence of true adulthood is deferred gratification."