Table 2 presents the evaluation issues and questions that were derived from the logic model developed by the evaluation team. The detailed evaluation matrix, which outlines the indicators and data sources used to address the evaluation questions, is presented in Appendix B.
Evaluation Issue | Evaluation Questions |
---|---|
Relevance | |
Effectiveness (short term) | |
Economy | |
The evaluation framework uses multiple lines of evidence and complementary research methods to ensure the reliability and validity of the data. The following research methods were used:
Each of these methods is described in more detail below.
Document review. Three main types of documents were reviewed during the evaluation:
For the full list of documents reviewed, please see Appendix C. Note that the evaluation did not review or assess Treasury Board submissions and précis for quality.
Interviews. Twenty-six interviews were completed (Table 3 and Appendix D). Interviewees included
Interview Group | Number of Interviews |
---|---|
Program analysts and COE analysts | 11 |
Representatives from federal organizations | 12 |
External stakeholders | 3 |
Total | 26 |
All interviews were conducted by telephone. Interviewees were sent an interview guide (see Appendix E) before the interviews were conducted.
Survey. A survey was administered over the Internet to program analysts, COE analysts, and representatives of federal organizations that put forward a Treasury Board submission in the last five years. A total of 547 individuals were asked to complete the questionnaire; 220 usable responses were received, for an overall response rate of 40% (see Table 4).19
Survey Group | Total Sent | Received | Removed | Total Kept | Response Rate | Confidence Interval |
---|---|---|---|---|---|---|
Program analysts | 135 | 60 | 0 | 60 | 44.4% | 95% ± 9.5% |
COE analysts | 181 | 66 | 0 | 66 | 36.5% | 95% ± 9.6% |
Federal organizations | 231 | 99 | 5 20 | 94 | 40.7% | 95% ± 7.8% |
Total | 547 | 225 | 5 | 220 | 40.2% | |
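The confidence intervals in Table 4 are consistent with the standard margin-of-error formula for a proportion, applied with a finite population correction. The report does not state the formula it used, so the following is an illustrative check only; it assumes the most conservative proportion (p = 0.5) and a 95% confidence level (z = 1.96):

```python
import math

def margin_of_error(sample_n, population_n, z=1.96, p=0.5):
    """Margin of error for a proportion from a simple random sample,
    with a finite population correction (FPC).

    Assumes p = 0.5 (most conservative) and a 95% confidence level
    (z = 1.96). This is a standard survey-sampling formula, not one
    documented in the report itself.
    """
    standard_error = math.sqrt(p * (1 - p) / sample_n)
    fpc = math.sqrt((population_n - sample_n) / (population_n - 1))
    return z * standard_error * fpc

# Reproduce the confidence intervals reported in Table 4
for group, kept, sent in [("Program analysts", 60, 135),
                          ("COE analysts", 66, 181),
                          ("Federal organizations", 94, 231)]:
    print(f"{group}: \u00b1{margin_of_error(kept, sent) * 100:.1f}%")
```

Run against the "Total Kept" and "Total Sent" columns, this yields ±9.5%, ±9.6%, and ±7.8%, matching the table.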
All of the Secretariat's program analysts were invited to participate in the survey. They were also asked to provide contact information for all COE analysts they had consulted for advice on Treasury Board submissions during the last five years. Furthermore, the program analysts were asked to provide contact information for the individuals in federal organizations (including the Secretariat) that put forward a submission within the last year. Federal organizations were encouraged to forward the survey to any individual within the organization who had been involved in the Treasury Board submission review process.
Survey results are provided in Appendix G.
Working session with ADMs. A two-hour working session was conducted to gather qualitative information on the relevance, effectiveness, and economy of the Treasury Board submission process. Assistant secretaries from the Secretariat and selected ADMs responsible for corporate and/or strategic planning, as well as selected departmental chief financial officers (CFOs) from federal organizations, were invited to participate. The issues to be discussed during the session (see Appendix F) were provided to the participants in advance.
Administrative, financial, and statistical data. Administrative, financial, and statistical data were gathered for the purpose of assessing the effectiveness and efficiency of the submission process. The evaluation team worked with the Treasury Board Submission Centre to review data related to Treasury Board submissions and to gain a greater understanding of the Submission Tracking System (STS). The evaluation team also reviewed data from the Management Accountability Framework (MAF) database.
Costing. Evaluating economy requires an analysis of the costs involved throughout the submission process. A costing exercise was undertaken; however, because direct costs related to Treasury Board submissions are not tracked separately, only the level of effort of some participants involved in the submission process was available. Section 6(c)(i), "Resources allocated to the submission process," therefore does not identify an estimated cost for the process.
Related international practice. The evaluation team undertook a limited review of submission process models used in other international jurisdictions.21 Given the cursory nature of this review, the evaluation team could not draw conclusions on the appropriateness of other models compared to the Canadian context. Findings from this review are therefore not presented in this report, though some interesting information was discovered during the review. For example, the role played by the Secretariat's assistant secretaries, whereby they present a federal organization's submission to Treasury Board, may be unique internationally. As is the case in the Canadian federal model, Secretariat equivalents in other jurisdictions are responsible for logistical and technical functions related to sessions of Cabinet, strategic and work planning, policy advice, legal functions, some monitoring functions, and their own internal management functions.22 They scrutinize material presented to Cabinet, ensuring that legal and policy considerations have been accounted for within structured submissions. In six of ten surveyed Organisation for Economic Co-operation and Development (OECD) countries,23 the Secretariat equivalent prepares a recommendation on how the submission should be handled in the Cabinet-level meeting. In these jurisdictions, however, it is the deputy minister of the submitting organization who presents the submission at the Cabinet-level meeting, not the equivalent of a Secretariat assistant secretary. Assistant secretary equivalents can therefore focus their attention on submissions that are most strategic or sensitive, or for which their recommendation runs counter to that of the submitting organization.
Other information. Once the data were collected and analyzed, information gaps were discovered in a few key areas. To fill these gaps, the following methodologies were used:
Time frame. A federal election was called shortly after the evaluation was launched. Not long after the election, Parliament was prorogued. These events delayed approval for the opinion research to be conducted for the evaluation; consequently, the time available to perform the research was limited. Another consequence of the election and subsequent prorogation was that the evaluation team could not interview Treasury Board ministers regarding the support they receive through the Treasury Board submission process.
Logic model. While the evaluation team believes that the ultimate outcome proposed in the logic model it developed is a valid description of the purpose of the Treasury Board submission process and therefore a valid basis for the evaluation, it was not possible to validate the ultimate outcome with an appropriate cross-section of stakeholders.
Review of performance measurement data. Performance data have not been collected on all aspects of the Treasury Board submission process. For instance, data are not collected on the extent to which the Treasury Board submissions officially submitted by federal organizations are actually required, on their compliance with policies and processes, or on the extent to which Treasury Board decisions reflect recommendations in the précis. In the absence of this information, the evaluation team was unable to assess the quality of submissions and précis. The evaluation therefore relied more heavily on survey and interview data to assess the effectiveness of the process.
Administrative data. Limited administrative data were available on the submission process and its results. Furthermore, as noted in the Treasury Board of Canada Secretariat Audit of Leave and Overtime (2008),24 some overtime data are not reliable, thereby limiting the extent to which the evaluation team could use such data to assess the amount (and related costs) of overtime claimed by program analysts in connection with Treasury Board submissions. In addition, other administrative data, such as the number of days between a Treasury Board decision and the issuance of the decision letter, were not available but would have provided additional lines of evidence.
Surveys and interviews. Given the highly variable nature of Treasury Board submissions, it stands to reason that the submission process experience would differ significantly from one submission to the next. More interviews would have provided better data on the effect of variations in a submission's size, scope, value, and complexity. The evaluation methodology attempted to address this limitation by inviting individuals from all federal organizations to participate in the Web-based survey. The results of the survey were cross-validated with the interview responses.
Costing methodology. Cost information to support a complete and accurate costing of the Secretariat's involvement in the Treasury Board submission process was not available.
Single-window approach to service delivery. Because the Secretariat uses a single-window approach for submissions, in which program analysts and their directors or executive directors are the point of contact for representatives from federal organizations, interaction between COE analysts and representatives from federal organizations is extremely limited. As a result, despite COE analysts having a clear role and contribution at the pre-submission and draft stages of the submission process, representatives from federal organizations may not be fully aware of the extent of this role. This may have led to the evaluation's greater focus on program analysts.
While there are some limitations with the evaluation methodology, multiple lines of evidence were used to draw conclusions about the Treasury Board submission process, strengthening the reliability and validity of the evaluation results. Despite the limitations, the methodology meets the requirements of the Treasury Board Policy on Evaluation and associated standards.