these two datasets to learn how we can improve our own review process." Using the Leap toolkit, we can easily compare the defects we find in our own reviews against the defects the journal reviewers find. From this comparison, we should be able to develop checklists that help us find and remove the kinds of defects those external reviewers catch.
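At its core, this comparison is a set difference over defect categories: which defect types did the external reviewers find that our internal reviews missed? A minimal sketch of that analysis follows; the record layout and field names are illustrative assumptions, not Leap's actual data format.

```python
# Hedged sketch: compare defect types caught internally vs. by external
# (journal) reviewers. The records and field names are illustrative
# assumptions, not Leap's actual data format.

internal_defects = [
    {"type": "interface", "location": "parser.c:120"},
    {"type": "logic", "location": "eval.c:45"},
]
external_defects = [
    {"type": "logic", "location": "eval.c:45"},
    {"type": "documentation", "location": "README"},
]

internal_types = {d["type"] for d in internal_defects}
external_types = {d["type"] for d in external_defects}

# Defect types the external reviewers found that our reviews missed --
# candidates for new review checklist items.
missed_types = external_types - internal_types
print(sorted(missed_types))  # prints ['documentation']
```

Each defect type in `missed_types` points at a blind spot in the internal review, which is exactly the kind of item a tailored checklist would target.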
Leap is publicly available at http://csdl.ics.hawaii.edu/Tools/LEAP/LEAP.html. Since we publicly announced the release of Leap in December 1998, several individuals not associated with CSDL have adopted it.
We are looking for industrial partners who are willing to use Leap and allow us to record their experiences. We will also use Leap to help teach a graduate software engineering class in the fall of 1999, in which the students will use Leap to conduct reviews of their classmates' programs.
We are developing the Leap obfuscator and designing the Leap data repository. These two tools should help with the maintenance and distribution of valuable review checklists.