Analysis of ISBSG Data to Understand Testing Effort

by A. Abran & K. Jayakumar

ISBSG R12 data has been investigated to understand the effort expended on the testing phase of software development projects. The available data was filtered to retain web-based/client-server projects so that the results apply to current-generation projects. The data was further filtered to include only projects whose size was measured using either IFPUG FPA or COSMIC.

Analysis of the data resulted in three homogeneous groups of projects:
(a) Projects consuming low test effort (up to 1 hr per function point)
(b) Projects consuming average test effort (above 1 hr but less than 3 hrs per function point) and
(c) Projects consuming high test effort (above 3 hrs per function point)
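The three groups above can be sketched as a simple classification rule. This is a hypothetical illustration, not the authors' analysis code: the function name, the sample data, and the treatment of the boundary at exactly 3 hrs/FP (assigned to the high group here) are assumptions.

```python
def classify_test_effort(test_hours: float, function_points: float) -> str:
    """Assign a project to one of the three test-effort groups
    based on test hours per function point (hrs/FP)."""
    ratio = test_hours / function_points
    if ratio <= 1.0:
        return "low"      # group (a): up to 1 hr per function point
    elif ratio < 3.0:
        return "average"  # group (b): above 1 hr but less than 3 hrs
    else:
        return "high"     # group (c): above 3 hrs (boundary value assumed high)

# Illustrative projects: (test effort in hours, size in function points)
projects = [(80, 100), (250, 100), (400, 100)]
groups = [classify_test_effort(h, fp) for h, fp in projects]
```

With the illustrative data above, the three projects fall into the low, average, and high groups respectively.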

The presentation will provide key statistics computed for each of the three groups, including the size-to-effort relationship. Further analysis of the three groups produced two key findings, which will be discussed in the presentation:
(1) More than 60% of the projects that consumed low testing effort (first group) had specification reviews, design reviews, or code reviews as part of their development phase, whereas only a small percentage of projects in the second and third groups had such reviews prior to testing.
(2) Another interesting observation concerns test automation: over 90% of the projects whose tests were automated consumed low testing effort.
Even though both of the above observations are intuitive, the study of ISBSG data provides quantitative, objective support for them.

The presentation will also provide key test effort statistics on various subsets of the R12 data, such as Development Projects, Enhancement Projects, Business Applications, Real-time Applications, Projects measured using IFPUG FPA, and Projects measured using COSMIC. As there are no ISBSG reports specifically on software testing effort, this information will be useful to the industry.
