ISBSG promotes the sharing of information and insights with the software metrics community. To fulfil this aim, ISBSG hosts the annual “IT Confidence Conference”, a forum for software metrics experts from around the world to present the latest news and research in the field. ISBSG may also host smaller conferences throughout the year. All conference presentations can be found here.
Presentations from ISBSG’s 2025 IT Confidence Conference
Quantitative Dependability Evaluation of Train Control Systems
A. Fantechi – University of Firenze
Technological advances in modern Train Control Systems (TCS) have led to vast improvements. However, they also raise concerns about the effects that uncertainty in critical TCS parameters has on dependability-related attributes. These effects can be examined through quantitative measures and the stochastic modeling and evaluation of TCS. In this paper, we illustrate the results of a systematic literature review on the quantitative evaluation of dependability-related attributes of TCSs under uncertainty in vital parameters.
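As a rough illustration of what stochastic evaluation under parameter uncertainty can look like in practice, the sketch below (not taken from the paper; all parameter ranges are hypothetical) propagates uncertain failure and repair times through the classic two-state steady-state availability formula A = MTBF / (MTBF + MTTR). The surveyed literature uses far richer stochastic models, but the principle of sampling vital parameters and observing the spread of a dependability attribute is the same.

```python
# Minimal sketch: Monte Carlo propagation of parameter uncertainty through
# a two-state (up/down) availability model. All ranges are hypothetical.
import random

def steady_state_availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Classic steady-state availability of a repairable component."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def monte_carlo_availability(n_samples: int = 10_000) -> tuple[float, float]:
    """Sample uncertain MTBF/MTTR and return the mean availability and a
    pessimistic (5th-percentile) estimate across the samples."""
    samples = []
    for _ in range(n_samples):
        mtbf = random.uniform(8_000, 12_000)  # uncertain time between failures
        mttr = random.uniform(2, 8)           # uncertain time to repair
        samples.append(steady_state_availability(mtbf, mttr))
    samples.sort()
    return sum(samples) / n_samples, samples[int(0.05 * n_samples)]

if __name__ == "__main__":
    mean_a, p5_a = monte_carlo_availability()
    print(f"mean availability = {mean_a:.6f}, 5th percentile = {p5_a:.6f}")
```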
Become Competitive – Benchmark, Act, Improve
Pierre Almen – ISBSG
Organisations spend heavily on the development and maintenance of software applications, yet IT managers often don’t know whether the productivity, cost, speed of delivery and quality of the deliverables are competitive. When outsourcing, suppliers are often chosen based on hourly rates rather than capability. Benchmarking highlights how development and maintenance organisations perform and where they can improve. This presentation explores benchmarking and how it can be performed using the ISBSG repository.
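As a concrete illustration of the benchmarking step, the sketch below (not ISBSG tooling; all figures are hypothetical) positions one project’s Project Delivery Rate (PDR, effort hours per function point) against the quartiles of a peer group, such as one filtered from an ISBSG extract by platform, language and size band. Lower PDR means higher productivity.

```python
# Minimal sketch: quartile benchmarking of a project's delivery rate against
# a hypothetical peer group. Lower PDR (hours per function point) is better.
from statistics import quantiles

# Hypothetical PDR values (h/FP) for comparable projects, e.g. filtered from
# an ISBSG extract by platform, language and functional size band.
peer_pdr = [4.1, 5.3, 6.0, 6.8, 7.5, 8.2, 9.9, 11.4, 13.0, 16.5]

def benchmark_pdr(effort_hours: float, function_points: float) -> str:
    pdr = effort_hours / function_points
    q1, median, q3 = quantiles(peer_pdr, n=4)  # quartile cut points
    if pdr <= q1:
        position = "top quartile (highly competitive)"
    elif pdr <= median:
        position = "better than the peer median"
    elif pdr <= q3:
        position = "worse than the peer median"
    else:
        position = "bottom quartile (clear improvement potential)"
    return (f"PDR = {pdr:.2f} h/FP -> {position} "
            f"(Q1 = {q1:.2f}, median = {median:.2f}, Q3 = {q3:.2f})")

print(benchmark_pdr(effort_hours=5400, function_points=800))
```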
Functional Measurement of Development in a CMS Environment
M. Sgamma – NTT Data
All Content Management Systems (CMS) offer functions to customize publishing features, and these customizations are often part of software development projects. This presentation focuses on measuring content development and publishing features to provide guidelines for sizing CMS development activities that add functionality to the CMS.
Benchmarking Outsourcing Application Development Contracts with ISBSG Data
Harold van Heeringen – IDC
The effective benchmarking of outsourcing application development contracts is critical to optimize performance, cost, and quality. This presentation explores the use of ISBSG data to obtain robust benchmarks for outsourced software development projects. Attendees will gain insights on vendor performance, inefficiencies, and competitive pricing. This presentation equips IT leaders with strategies to maximize value and mitigate risks in outsourcing.
Collect, Analyze, and Verify: Elicitation as the Key to Problem Setting
D. Dini – IIBA-Italy
Accurate problem setting is the foundation of successful projects. The BABOK® Guide (Business Analysis Body of Knowledge) of the International Institute of Business Analysis defines Elicitation and Collaboration as a key knowledge area and the foundation for defining project requirements. The five key elicitation activities according to BABOK® are illustrated, along with the techniques applicable to the collection of information.
IFPUG SNAP and its ISBSG Data Collection Questionnaire
L. Buglione, F. Di Cola – ISBSG
Effectively identifying and ensuring a shared understanding of Non-Functional Requirements (NFRs) is the new frontier in software projects. The relevant standards have evolved, and sizing these dimensions of software remains a challenge for the software measurement field. IFPUG SNAP is the first ISO/IEEE standard for sizing them, and historical data can be used to analyse projects with these new sizing units. A new ISBSG Data Collection Questionnaire supports this effort.
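To make the idea concrete, here is a hypothetical sketch (not the actual questionnaire schema) of a project record that pairs functional size in function points with non-functional size in SNAP Points, so that each dimension can be analysed with its own delivery rate; the clean split of effort between functional and non-functional work is an assumption made purely for illustration.

```python
# Hypothetical sketch of a project record pairing functional size (FP) with
# non-functional size (SNAP Points, SP). Not the real questionnaire schema.
from dataclasses import dataclass

@dataclass
class ProjectRecord:
    project_id: str
    functional_size_fp: float         # IFPUG function points
    non_functional_size_sp: float     # IFPUG SNAP points
    functional_effort_hours: float    # assumed separable for illustration
    non_functional_effort_hours: float

    def fp_delivery_rate(self) -> float:
        """Hours per function point for the functional work."""
        return self.functional_effort_hours / self.functional_size_fp

    def sp_delivery_rate(self) -> float:
        """Hours per SNAP point for the non-functional work."""
        return self.non_functional_effort_hours / self.non_functional_size_sp

rec = ProjectRecord("P-001", 450, 120, 3150, 600)  # hypothetical values
print(f"{rec.fp_delivery_rate():.1f} h/FP, {rec.sp_delivery_rate():.1f} h/SP")
```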
What software complexity do we need for software effort estimation? How ISBSG data analysis may contribute to deconstructing cognitive biases
R. Meli – DPO
An empirical study, using ISBSG data, challenges the long-held belief that Unadjusted Function Points (UFP) provide more accurate effort estimates for complex projects than simplified measures such as Simple Function Points (SFP). The study concludes that the traditional definition of complexity, based on Data Element Types (DETs) and File Types Referenced (FTRs), is not a significant factor in improving the accuracy of software effort estimation, suggesting that a new definition of complexity may be needed.
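The contrast the study examines can be sketched as follows: UFP weights each function by a complexity class derived from DETs and FTRs, while SFP drops complexity and applies flat coefficients (4.6 per elementary process and 7.0 per data group, per the published Simple Function Point method). The weight tables below are the standard IFPUG ones; the function inventory is hypothetical.

```python
# Minimal sketch contrasting complexity-weighted UFP with flat-weight SFP.
# The IFPUG weights are standard; the function inventory is hypothetical.
UFP_WEIGHTS = {
    "EI":  {"low": 3, "average": 4, "high": 6},    # external inputs
    "EO":  {"low": 4, "average": 5, "high": 7},    # external outputs
    "EQ":  {"low": 3, "average": 4, "high": 6},    # external inquiries
    "ILF": {"low": 7, "average": 10, "high": 15},  # internal logical files
    "EIF": {"low": 5, "average": 7, "high": 10},   # external interface files
}

# Hypothetical inventory: (function type, complexity class from DETs/FTRs).
functions = [("EI", "high"), ("EI", "low"), ("EO", "average"),
             ("EQ", "low"), ("ILF", "average"), ("EIF", "low")]

ufp = sum(UFP_WEIGHTS[ftype][cplx] for ftype, cplx in functions)

# SFP ignores complexity: flat weights per process and per data group.
processes = sum(1 for ftype, _ in functions if ftype in ("EI", "EO", "EQ"))
data_groups = sum(1 for ftype, _ in functions if ftype in ("ILF", "EIF"))
sfp = 4.6 * processes + 7.0 * data_groups

print(f"UFP = {ufp}, SFP = {sfp:.1f}")  # sizes diverge only via complexity
```

In published comparisons the two measures tend to be strongly correlated, which is part of the motivation for questioning whether complexity adjustments earn their keep in effort estimation.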
IT Confidence Conference Archives
View all presentations from previous conferences, archived by conference year. Click on a presentation title below to see the presentation.