Treasury Board of Canada Secretariat

ARCHIVED - Establishing the Baseline - Government-Wide Summary and Analysis of IT Project Practices



Archived Content

Information identified as archived on the Web is for reference, research or recordkeeping purposes. It has not been altered or updated after the date of archiving. Web pages that are archived on the Web are not subject to the Government of Canada Web Standards. As per the Communications Policy of the Government of Canada, you can request alternate formats on the "Contact Us" page.

2. Methodology

This section describes the approach used to establish the baseline.

2.1 The Foundation

The Software Engineering Institute's (SEI) Software Capability Maturity Model (SW-CMM)4 and the corresponding CMM Based Appraisal for Internal Process Improvement (CBA-IPI) method were selected as the foundation for the baselining methodology. This method was preferred over other available methods because of its widespread support in industry and its ability to provide the government with a thorough assessment of the processes currently implemented within departments.

The CBA-IPI approach, however, requires a significant investment of both human and financial resources, and it considerably impacts organizations5. As a result, the approach adopted respected the principles and practices required by the CBA-IPI while minimizing the impact on departmental participants. In essence, the methodology was streamlined to:

  • Enable individual departmental assessments to be done in half a day; and
  • Minimize the number of required participants by using a small but knowledgeable and experienced sample of IT practitioners and managers.

The streamlined methodology consisted of administering a questionnaire based upon one developed by the Software Productivity Centre (SPC) for their SoftGuideTM product. The SoftGuideTM questions apply to organizations currently operating at Level 1 or 2 of the SEI SW-CMM. The terminology and degree of formality of the questions in SoftGuideTM are more suitable to the size and structure of Canadian government IT organizations than is SEI's Maturity Questionnaire. The SoftGuideTM approach has been used successfully in over 30 assessments.

2.2 The Questionnaire

The SoftGuideTM questionnaire contains 89 questions from the 13 Key Process Areas corresponding to Levels 2 and 3 of the SEI SW-CMM.6 In order to properly address the full scope of the Enhanced Framework, the SoftGuideTM questionnaire was expanded to:

  • Address Project Governance issues identified in the Enhanced Framework;
  • Cover the system-level activities normally preceding and following the SW-CMM Key Process Areas; and
  • Address all types of government IT projects, including Software Acquisition and infrastructure projects.

Despite the substantial differences between departments—in size of IT expenditures, in IT management styles, and in IT Service Delivery Models—the questionnaire was applicable to all participating departments. Participants were able to relate to the questions and respond appropriately whether projects were traditional software development, commercial off-the-shelf (COTS) acquisition, or infrastructure upgrades. This shows that the Key Process Areas assessed are generic and validates the baseline results as a tool that can provide guidance towards improvement. The final list of questions is provided in Appendix 1.

2.3 The Participants

Twenty of the largest departments in terms of IT-related expenditures were solicited to participate in the baselining process. Their level of IT expenditure was based on 1996/1997 data from a Central Accounts report dated June 5, 1997. All departments responded positively and key representatives from each participated in a half-day workshop conducted for their department. The list of participating departments and participants is provided in Appendix 2.

2.4 The Workshops

All the workshops were conducted from November 1997 to March 1998 by the authors of this report. Everyone participating in each workshop was given a brief presentation that described the Enhanced Framework, the SEI SW-CMM, and the assessment process. This presentation preceded the administration of the questionnaire.

The possible responses7 to the questions were as follows:

  • Yes when:
    – The practice is well established and consistently performed;
    – The practice is almost always followed; and
    – The practice is considered standard operating procedure.
  • No when:
    – The practice is not well established or is not consistently performed; and
    – The practice may be performed sometimes, or even frequently, but it is omitted under difficult circumstances.
  • Does Not Apply when:
    – Participants had the required knowledge about the project or organization and about the question asked, but felt the question did not apply to their department. For example, the section on "Subcontract Management" will not apply if the department does not subcontract work.
  • Don't Know when:
    – Participants did not know how to answer the question.

Rather than strictly observing the Yes/No criteria defined above, common sense was used to determine when to respond Yes or No. The "80/20" rule was adopted: when at least 80 percent of projects within a department implemented the practice, participants answered Yes. Such an interpretation of the Yes/No criteria is not unusual in assessment methodologies.
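The response categories and the "80/20" interpretation amount to a simple decision rule. The sketch below illustrates that rule; the function name, parameters, and data shapes are assumptions for illustration only, not part of the official methodology.

```python
# Illustrative sketch of the response rules described above.
# Names and signatures are hypothetical, not part of the methodology.

def answer_for_practice(projects_implementing: int,
                        total_projects: int,
                        applies: bool = True,
                        known: bool = True) -> str:
    """Map a department's experience with one practice to a response.

    Applies the "80/20" interpretation: answer Yes when at least
    80 percent of projects implement the practice.
    """
    if not known:
        return "Don't Know"          # participants could not answer
    if not applies:
        return "Does Not Apply"      # e.g. no subcontracted work
    if total_projects == 0:
        return "Don't Know"          # nothing to base an answer on
    ratio = projects_implementing / total_projects
    return "Yes" if ratio >= 0.8 else "No"

# Example: 9 of 10 projects follow the practice
print(answer_for_practice(9, 10))   # Yes
print(answer_for_practice(5, 10))   # No
```

In practice, as the section notes, participants tempered this rule with discussion and recorded qualifications in the Comments section rather than applying a strict threshold.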

In addition, participants often qualified their responses with "Yes, but…" or "No, but…" followed by an explanation. Sometimes discussion of the perceived need—or lack of need—for improvement in a specific area led to the decision on what to respond. Participants used the Comments section of the questionnaires to qualify their response or to record the results of their discussions.

During each workshop, participants reached an understanding as to what constituted a "project" within a given department. This consensus was necessary in order to provide a sound and consistent basis for responding to the project-related questions. While these definitions were not necessarily consistent across all departments, TBS PMO representatives enforced an acceptable range of definitions to preserve the integrity of the baseline results. This understanding was documented in the workshop record.

All answers were recorded and sent to the participating departments for internal review and validation. These validated results provide the basis for the baseline.
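As a rough illustration of how validated departmental answers could be rolled up into baseline figures, the sketch below tallies the percentage of Yes responses per question, excluding "Does Not Apply" and "Don't Know" answers. The data layout and function name are hypothetical; the report does not prescribe a particular aggregation method.

```python
from collections import Counter

# Hypothetical layout: one dict of validated responses per department,
# keyed by question identifier.
dept_responses = [
    {"Q1": "Yes", "Q2": "No"},
    {"Q1": "Yes", "Q2": "Does Not Apply"},
    {"Q1": "No",  "Q2": "No"},
]

def baseline_summary(responses):
    """Percentage of Yes answers per question, counting only departments
    that answered Yes or No (i.e. those where the question applied)."""
    summary = {}
    questions = {q for r in responses for q in r}
    for q in sorted(questions):
        counts = Counter(r.get(q) for r in responses)
        applicable = counts["Yes"] + counts["No"]
        summary[q] = round(100 * counts["Yes"] / applicable) if applicable else None
    return summary

print(baseline_summary(dept_responses))  # {'Q1': 67, 'Q2': 0}
```

Excluding non-applicable answers from the denominator keeps a question such as "Subcontract Management" from depressing the baseline for departments that do not subcontract work.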