Price per FP Contract
A consultancy firm assisted an IT Services provider in winning a large European government price-per-FP contract using ISBSG data. The data, covering more than 11,000 software projects, provided valuable benchmarks for comparing productivity and cost. This enabled the IT Services provider to produce a competitive price-per-FP offer that beat 20 rival bidders. Learn about the strategic value of ISBSG benchmarks in fixed-price contracts based on functional sizing.
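The pricing logic behind such an offer can be sketched in a few lines. All figures below are invented for illustration; they are not ISBSG benchmark values, and a real bid would account for risk, scope bands, and contract terms.

```python
# Hypothetical sketch: deriving a price-per-FP offer from a benchmarked
# delivery rate (hours per function point), an hourly rate, and a margin.
# The numbers are illustrative only, not ISBSG data.

def price_per_fp(hours_per_fp: float, hourly_rate: float, margin: float) -> float:
    """Price per Function Point = unit effort x hourly rate, plus margin."""
    return hours_per_fp * hourly_rate * (1 + margin)

# Suppose benchmarking suggests a delivery rate of 8 hours/FP at EUR 75/hour,
# with a 10% margin:
offer = price_per_fp(hours_per_fp=8.0, hourly_rate=75.0, margin=0.10)
print(round(offer, 2))  # 660.0
```

Comparing such a unit price against the distribution of benchmark prices is what makes the offer defensibly competitive rather than a guess.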
Benchmarking Agile Development Teams
A telecommunications provider faced challenges in assessing the performance of its agile development teams working on Java-based projects. The company lacked a standardized approach to measuring key performance indicators, could not determine how its teams compared to industry standards, and, without data-driven insights, could not identify areas for improvement. The benchmarking process used to evaluate the agile development teams relied on ISBSG data. It involved a structured, data-driven approach to measuring productivity, delivery speed, cost efficiency, and process quality, comparing the company's internal performance metrics against ISBSG's industry benchmarks to identify gaps and opportunities for improvement.
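The core of such a comparison is positioning an internal metric within the benchmark distribution. A minimal sketch, with invented benchmark values (lower hours/FP is better):

```python
# Illustrative sketch: what share of benchmark projects does the team
# outperform on productivity (hours per function point)?
# The benchmark sample below is invented, not ISBSG data.

def percentile_below(benchmark, value):
    """Fraction of benchmark projects with a worse (higher) hours/FP
    than `value`."""
    return sum(1 for b in benchmark if b > value) / len(benchmark)

benchmark_hours_per_fp = [5.2, 6.8, 7.5, 8.1, 9.0, 10.4, 12.3, 14.0]
team_hours_per_fp = 7.9

print(percentile_below(benchmark_hours_per_fp, team_hours_per_fp))  # 0.625
```

Repeating this for delivery speed, cost efficiency, and quality metrics yields the gap analysis described above.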
Estimation of a Redevelopment Project
An international pharmaceutical company had to redevelop functionality in a critical system integrated with numerous other applications, including clinical data management processes. To address this, the company evaluated two options: identify a replacement package with similar functionality, or redevelop the functionality in-house.
This short report focuses on the estimation process for the redevelopment option, using ISBSG data to deliver a robust, data-driven estimate. It highlights the value provided by this approach.
Software Complexity for Estimation
An empirical study using ISBSG data challenges the long-held belief that Unadjusted Function Points (UFP) provide more accurate effort estimates for complex projects than simplified measures such as Simple Function Points (SFP).
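For context, the Simple Function Point method sizes software from just two counts rather than the full UFP component breakdown. A minimal sketch, assuming the published SiFP weights of 4.6 per elementary process and 7.0 per logical data group (the counts below are invented):

```python
# Sketch of the Simple Function Point (SiFP) sizing formula.
# Weights 4.6 and 7.0 are the published SiFP coefficients; the counts
# are hypothetical.

def simple_function_points(n_processes: int, n_data_groups: int) -> float:
    """SiFP = 4.6 x elementary processes + 7.0 x logical data groups."""
    return 4.6 * n_processes + 7.0 * n_data_groups

# A system with 20 elementary processes and 10 logical data groups:
print(round(simple_function_points(20, 10), 1))  # 162.0
```

The study's question is whether this two-parameter measure loses meaningful estimation accuracy compared with the more labour-intensive UFP count.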
IFPUG SNAP
Effectively identifying and understanding Non-Functional Requirements (NFRs) is the current frontier in software projects. The models and standards that define them evolve in step with technological advances, which poses a challenge for software measurement. Today, IFPUG SNAP is the first ISO/IEEE standard for sizing these dimensions of software. Now is the time to start collecting historical data in these new sizing units, and a new ISBSG Data Collection Questionnaire (DCQ) is ready to support this effort.
Collect, Analyse and Verify
Accurate problem setting is the foundation of successful projects, but it can be compromised if information is not rigorously collected and validated. The BABOK® Guide (Business Analysis Body of Knowledge) of the International Institute of Business Analysis defines Elicitation and Collaboration as a key knowledge area: the foundation for defining project requirements. This talk analyses elicitation as a cyclical process that supports the quality of problem setting and improves requirements traceability. The five key elicitation activities according to BABOK® are illustrated, along with the techniques applicable to the functional collection of information.
Benchmarking Outsourcing Application Development Contracts
The effective benchmarking of outsourcing application development contracts is critical to optimize performance, cost, and quality. This presentation explores the use of ISBSG data to obtain robust benchmarks for outsourced software development projects. We discuss: selecting relevant ISBSG data, aligning benchmarks with project goals, and addressing challenges in comparing diverse outsourcing contracts. Attendees will gain insights on vendor performance, inefficiencies, and competitive pricing. Real case studies illustrate the use of ISBSG data to optimize outsourcing agreements.
Functional Measurement of Development
This presentation focuses on measuring content development and publishing features to provide guidelines for sizing CMS development activities that add functionality to the CMS.
Benchmark Act Improve
Organisations spend heavily on the development and maintenance of software applications, yet IT managers often do not know whether their productivity, cost, speed of delivery and quality of deliverables are competitive. When outsourcing, suppliers are often chosen on hourly rates rather than capability. Benchmarking shows how development and maintenance organisations perform and where they can improve. This presentation explores benchmarking and how it can be performed using the ISBSG repository.
Quantitative Dependability
Technological advances in modern Train Control Systems (TCS) improve their dependability, safety, availability, and capacity. However, this raises concerns about the effect that uncertainty in critical TCS parameters (e.g. position and speed) has on dependability-related attributes. These effects can be examined through quantitative measures, stochastic modelling, and evaluation of TCS. In this paper, we present the results of a systematic literature review on the quantitative evaluation of dependability-related attributes of TCSs under uncertainty in vital parameters.
Mobile App Development
Many organizations provide mobile applications (apps) for use on smartphones and tablets. Apps are an interface between a company and its customers, providing functionality and support; hence, companies must be able to add new functionality to apps quickly. This short report compares key metrics for enhancements to Android and iOS apps to determine whether there is a difference between the development platforms.
Agile Planning and Monitoring with Kanban and Measurement
Determining IT workforce requirements is an important part of software development planning. To achieve this, a work model supported by a parametric, repeatable, and auditable functional size measurement method should be used. This enables organizations to plan and monitor development in a simple and agile manner, especially when adopting frameworks like Kanban or Scrum. A measurement-based historical database populated with trustworthy data is crucial for effective planning and monitoring of software development in organizations striving for software development maturity.
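The workforce calculation this implies can be sketched simply: functional size times a historical delivery rate gives total effort, which the planned duration converts into staff. The figures below are hypothetical, not drawn from any historical database.

```python
# Illustrative sketch: staffing a Kanban backlog from functional size.
# size_fp and hours_per_fp would come from a measurement-based
# historical database; the values here are invented.

def required_staff(size_fp: float, hours_per_fp: float,
                   weeks: float, hours_per_week: float = 40.0) -> float:
    total_effort = size_fp * hours_per_fp          # person-hours
    capacity_per_person = weeks * hours_per_week   # hours per person
    return total_effort / capacity_per_person

# A 300 FP backlog at 8 hours/FP, planned over 12 weeks:
print(required_staff(300, 8.0, 12))  # 5.0
```

Because every input is measured rather than guessed, the plan is repeatable and auditable, which is the point of pairing Kanban with functional size measurement.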
Benchmarking Software Productivity and Quality for Large-Language Models and Low-Code/No-Code
Code generation through Large Language Models and the adoption of Low-Code/No-Code platforms have transformed traditional software development, requiring a reassessment of conventional benchmarking methodologies and updated productivity models that accurately reflect these contexts. Through benchmarking standards and metrics that reflect these new development modalities, organizations can better understand the true impact of these technologies on software delivery efficiency and product quality.
Effort Estimation in Software Development Projects Using Supervised Machine Learning
This presentation analyses a machine learning model that uses ISBSG data for effort estimation. Its accuracy is discussed, along with future directions. Such machine learning models support the development of predictive tools tailored to individual organizational contexts.
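To illustrate the general idea (not the presenter's specific model), a minimal supervised approach fits a log-log linear model, effort ≈ a · size^b, to historical size/effort pairs. The training data below is invented; real models use many more ISBSG features and a library such as scikit-learn.

```python
# Minimal sketch of supervised effort estimation: ordinary least squares
# on log(size) vs log(effort), i.e. effort ~ a * size**b.
# Training pairs (function points, person-hours) are hypothetical.
import math

def fit_loglog(sizes, efforts):
    xs = [math.log(s) for s in sizes]
    ys = [math.log(e) for e in efforts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b  # effort is predicted as a * size**b

def predict(a, b, size):
    return a * size ** b

sizes = [100, 200, 400, 800]       # function points
efforts = [800, 1500, 3100, 6000]  # person-hours
a, b = fit_loglog(sizes, efforts)
print(round(predict(a, b, 300)))   # prediction falls between the
                                   # 200 FP and 400 FP training efforts
```

Fitting on an organization's own completed projects, rather than generic data, is what tailors such a predictive tool to its context.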
Evaluating Digital Intangible Assets using ICT Measurement Standards
This presentation explores accounting-reporting methods that use technical and economic measurements of intangible assets. By leveraging measurement standards such as FPA (Function Point Analysis) and SNAP (Software Non-functional Assessment Process), together with project management methodologies, a comprehensive understanding of the fair value of digital assets can be achieved.