Does the Treasury Board submission process address a demonstrable need, and is it appropriate to the federal government?
The Treasury Board submission process aligns well with the Secretariat's strategic outcome: "Government is well managed and accountable, and resources are allocated to achieve results."25 This strategic outcome is consistent with the responsibilities of Treasury Board ministers as set out in the FAA and therefore speaks to the role of the Secretariat in providing advice and recommendations to Treasury Board ministers through the Treasury Board submission process. The need for a Treasury Board submission process to support ministers is particularly relevant, given the recent focus on responsible spending and the renewal of the expenditure management system, which includes the strategic review of the direct program spending of all departments and agencies of the Government of Canada to reallocate funds from low-priority and low-performing programs to higher priorities.
The Guide states the following:
Improving the quality of information and accountability for results are key elements of the new approach to managing spending across government. The new approach supports managing for results by establishing clear responsibilities for departments to better define the expected outcomes of new and existing programs. It supports decision making for results by ensuring that all new programs are fully and effectively integrated with existing programs and by reviewing all spending to ensure efficiency, effectiveness, and ongoing value for money. Finally, it supports reporting for results by improving the quality of departmental and government-wide reporting to Parliament.26
This excerpt describes the "demonstrable need" that the Treasury Board submission process is meant to address. Essentially, for good management of public programs, the government requires rules, practices, and processes that:
In assessing relevance, the key evaluation issue is therefore whether or not the Treasury Board submission process is capable of meeting these requirements. The question of whether it actually meets these requirements is addressed under the assessment of effectiveness. The question of whether, and to what extent, other processes or mechanisms would be capable of meeting these requirements is outside the scope of this evaluation.
The documents reviewed for this evaluation, as well as the interviews conducted, point to the conclusion that the Treasury Board submission process is relevant to the government's "demonstrable needs" for good public management. As noted above, the submission process aims to ensure that minimum standards are met with regard to:
- how the federal organization will carry out the policy initiative;
- why the proposed method of implementation is the best one;
- how the proposal contributes to government-wide aims such as accountability, transparency, and interoperability of information;
- what the expected outcomes and deliverables are, as per the organization's Management, Resources, and Results Structure (MRRS); and
- how the federal organization will conduct monitoring and evaluation to ensure the program is meeting its policy objectives, including progress reports on outcomes, projected efficiency, timelines, and cost targets.27
According to the interviews, general perceptions about the purpose of the submission process broadly correspond to this formal description from the Guide.
This formal description of the Treasury Board submission process is well aligned with the government's stated needs for good public management.
To what extent has the Treasury Board submission process achieved its expected outcomes?
Ideally, this evaluation would assess the effectiveness of the Treasury Board submission process in relation to its impact on the ultimate outcome in the logic model that the evaluation team developed for the purposes of the evaluation, i.e. "program implementation serves the policy outcomes defined by Cabinet in the most economical manner possible and fulfills the government's obligations for transparency, accountability, and prudence." However, attempts to link the Treasury Board submission process directly to this ultimate outcome would face data-gathering and methodological challenges that could not be addressed within the evaluation's scope and its time and resource constraints. The evaluation's assessment of the effectiveness of the Treasury Board submission process is therefore focused on the immediate and intermediate outcomes specified in the logic model developed by the evaluation team. If the evaluation finds that the Treasury Board submission process is generally effective in contributing to these outcomes, then it would be reasonable to conclude that the submission process is also making a significant contribution to the ultimate outcome.
The evaluation therefore focused on the following questions:
The majority of survey participants in all three categories (83.3% of program analysts, 76.9% of COE analysts, and 83% of representatives from federal organizations) believed they had a strong understanding of the Treasury Board submission process. There was no correlation between the length of time respondents had been in their job and their perception of their own understanding of the process.
Survey respondents rated their counterparts' understanding of the Treasury Board submission process less favourably than they rated their own (Table 5). For example, 83% of federal organization respondents believed they had a strong understanding of the process, whereas less than 50% of the program analysts surveyed agreed that this was true. There was a similar divergence of opinion between program and COE analysts within the Secretariat. Each group identified a need for more training for the other.
% Agreed | Program Analysts | COE Analysts | Federal Organization Respondents |
---|---|---|---|
I have a strong understanding of the Treasury Board submission process. | 83.3 (n=60) | n/a | n/a |
I have a strong understanding of the elements of Treasury Board submissions, policies, and processes. | n/a | 76.9 (n=65) | 83.0 (n=94) |
The federal organizations I work with have demonstrated, over the years, an increased understanding of the elements of Treasury Board submissions, policies, and processes. | 48.3 (n=58) | n/a | n/a |
Program analysts I work with have demonstrated, over the years, an increased understanding of the elements of Treasury Board submissions, policies, and processes. | n/a | 50.0 (n=62) | n/a |
Notwithstanding the above, it should be noted that among the 12 interviews conducted with federal organizations, all 4 federal organizations with high submission rates believed that the Secretariat analysts assigned to them had a strong understanding of the Treasury Board submission process. Results were mixed among federal organizations with moderate and low submission rates, which indicated that the level of understanding varied depending on the analyst assigned to them.
HR and survey data showed that approximately two-thirds of program analysts had been in their position for two years or less, which may at least partially explain the external perceptions about program analysts' understanding of the submission process.
Training. To assess the Secretariat's tools, support, and services, the evaluation team examined documentation on the boot camps held for program analysts and the Secretariat's guidance document on Treasury Board submissions, A Guide to Preparing Treasury Board Submissions. The evaluation team also asked whether courses on the Treasury Board submission process had been taken from the Canada School of Public Service.
The Secretariat offers program sector boot camps, which have received positive overall ratings and are considered useful in providing a general overview of what program analysts should know. However, in the participant feedback forms completed following the boot camps28 and in interviews, Secretariat analysts indicated that the boot camps are not offered often enough and that their duration (two days) does not allow for sufficient coverage of the Treasury Board submission process. As one boot camp participant noted, the sessions "could… have more hands-on training."
Evaluation participants also indicated that:
Relationship between Secretariat analysts and federal organizations. Survey results demonstrated that efforts are being made to foster a positive working relationship between program analysts and federal organizations. Almost all of the program analysts surveyed (91.6%) reported that they maintain regular contact with federal organizations regardless of whether a submission is currently being developed or processed. All of the program analysts interviewed stated that they have positive and productive relationships with federal organizations. It is worth noting, however, that when asked how they would characterize this relationship, Secretariat analysts did not refer to the three roles (enabler, challenger, and champion) defined in the Change Agenda.
Federal organizations, for their part, indicated that their relationships with program analysts were generally good or had improved. Nearly all federal organization respondents said they knew whom to consult at the Secretariat with respect to their submissions.
Appropriateness29 of Treasury Board submissions put forward. Secretariat analysts have a role to play at the pre-submission stage when federal organizations are considering proceeding with a submission. At the outset, federal organizations may want early feedback to confirm the direction of their intended submission or to confirm that a submission is in fact necessary. According to the interviews, 58% of federal organizations seek pre-submission assistance, especially when the submission is complex or perceived as higher risk.30 Advice provided by the Secretariat at the pre-submission stage is intended (among other things) to help federal organizations make informed decisions about whether or not to put forth a submission. The assumption is that in the absence of such advice, federal organizations would proceed with a greater proportion of unnecessary submissions, creating a burden on the Secretariat and Treasury Board and reducing the efficiency of the submission process. The appropriateness of submissions was therefore regarded as an indicator of the suitability and quality of the services the Secretariat provides at the pre-submission stage.31
Secretariat analysts were asked about the appropriateness of submissions (Table 6 32).33 Approximately 40% of the program analysts surveyed responded that they received draft submissions at the pre-submission stage (for preliminary guidance) for initiatives that did not in fact require a submission. The views of COE analysts were similar.
Was the Secretariat consulted prior to sharing? (% of respondents) | COE analysts | Program analysts |
---|---|---|
n = sample size | n=29 | n=24 |
Yes | 44.8 | 62.5 |
No | 10.3 | 25.0 |
Do not know | 44.8 | 12.5 |
Were any drafts submitted? (% of respondents) | COE analysts | Program analysts |
---|---|---|
n = sample size | n=65 | n=58 |
Yes | 10.8 | 19.0 |
No | 36.9 | 55.2 |
Do not know | 52.3 | 25.9 |
Secretariat analysts were also asked how often federal organizations put forward draft submissions for formal consideration (as opposed to simply seeking informal feedback on a preliminary draft). At this stage, a much smaller proportion of analysts (19% of program analysts and 10.8% of COE analysts) felt that inappropriate drafts were submitted. The reported drop in unnecessary submissions between the stage at which the Secretariat is providing preliminary guidance and the stage at which a submission is formally put forward suggests that the Secretariat is having a positive impact on reducing the number of inappropriate submissions.
Accuracy, consistency, usefulness, and timeliness of Secretariat advice. Interviewees identified factors they believed had a positive or negative impact on the consistency, accuracy, and timeliness of the advice the Secretariat provides to federal organizations during the submission process.
Factors seen by interviewees as having a positive impact:
Factors seen by interviewees as having a negative impact:
Survey respondents were asked whether the submission review process ensures that Treasury Board submissions comply with government authorities and policies. A large majority of respondents from each of the groups agreed that it did (Table 7).
However, when survey respondents were asked whether the services provided by analysts enable federal organizations to put forth draft submissions that comply with Treasury Board authorities, policies, and directions, there was a divergence of opinion. Federal organization representatives and program analysts agreed at rates very similar to those for the preceding question (see Table 7), whereas only 49.2% of COE analysts agreed. This divergence could be a function of program analysts acting as the single point of contact for federal organizations, with COE analysts not always being consulted at the pre-submission stage.
Secretariat advice and guidance to federal organizations. Survey results (Table 8) show that the majority of respondents from federal organizations agreed that Secretariat analysts were providing consistent, accurate, useful, and timely advice. Of the four attributes, consistency and timeliness of advice received the lowest levels of agreement from both federal organization representatives and program analysts. Specifically:
It is important to note that the issues around timeliness and consistency of advice are recognized internally by the Secretariat, not only by federal organizations.
Consultations with senior management highlighted the value of the Secretariat's work with federal organizations during the pre-submission stage. However, the following areas were identified as needing improvement:
Adequacy of time for input into submission documents. The time required to complete the submission review process can vary greatly depending on a submission's characteristics. The Guide advises federal organizations to allow at least six weeks34 for the submission process and cautions that it is not unusual for the process to last more than six weeks. Survey results (Table 9) indicate that the process normally takes 8 to 10 weeks. This suggests that the Guide sets up unrealistic expectations, which could lead to a frustrating experience for someone who is new to the process.35
Secretariat analysts were asked for their views on whether they had enough time to review draft submissions properly. While not a majority, a significant number of program analysts and COE analysts (42.5% and 49.2%, respectively) said that they did not have enough time.
Participants in the ADM Working Session argued that the fast-track system used by the Secretariat to give priority handling to certain submissions is another factor affecting the time available to Secretariat analysts to review submissions.36 They said that it causes uncontrollable delays for submissions that are not fast-tracked. In discussions with Secretariat senior management, however, it was noted that there is no formal system for fast-tracking submissions. It is nevertheless true that certain submissions may receive priority treatment on an ad hoc basis due to urgent situations that require them to "jump the queue." It is at the discretion of the President of the Treasury Board to determine when a submission should receive priority handling; the decision is normally taken following negotiation with the minister responsible for the submission. Consultations between Secretariat officials and sponsoring federal organizations regarding time sensitivity of their proposals are held regularly; however, decisions regarding priority handling are not normally shared with the federal organizations whose submissions may be displaced as a result.
Extent to which Secretariat analysts' input is reflected in final Treasury Board submissions. Most Secretariat analysts stated that their advice is included in final submissions that go forward to Treasury Board. Some noted that, if their advice was not included, they might recommend that conditions be placed on the submission or that a remark be included in the précis. Survey results were similar, with 86.6% of program analysts and 71.2% of COE analysts responding that they felt their advice was reflected in the final submission.
This was confirmed with federal organization representatives, who stated that they include all or almost all of the comments provided by Secretariat analysts. Some noted that if they are in disagreement with the advice, they consult further to resolve the issue. A small minority (18.0%) of federal organization representatives stated that they include Secretariat comments due to time pressures and the perceived power of the Secretariat and not because they are in agreement with them.
Extent to which Treasury Board final decisions reflect Secretariat recommendations. Once the program analyst is satisfied that the submission is complete, he or she prepares a précis that includes recommendations to Treasury Board. The recommendations are discussed and agreed to at the Secretariat's Strategy Committee, which is chaired by the Secretary and includes the participation of assistant secretaries from across the Secretariat. A presentation based on the précis is then made to Treasury Board by the appropriate assistant secretary. Although the Secretariat has no authority over Treasury Board's decisions, the decisions normally reflect its recommendations. Data from the survey of program analysts suggest that Treasury Board decisions are consistent with the Secretariat's recommendations 82.2% of the time.
Information management. Once Treasury Board makes a decision, the Secretariat records the decision and formally advises the deputy minister of the sponsoring organization within fifteen calendar days. Although program analysts have no formal requirement to do so, they will normally advise the federal organization verbally of the Treasury Board decision as a matter of courtesy. This is usually done as soon as possible, generally the day after the Treasury Board meeting.37 While most federal organizations did not have an issue with the timeliness of the communication of Treasury Board decisions, it should be noted that one-third of federal organization representatives did not agree that Treasury Board decisions are communicated to them in a timely manner.
Of the program analysts and COE analysts interviewed, many indicated that they use their own private filing system for Treasury Board submissions. Most keep a hard copy of the decision for a period of time, after which the documents are archived or sent to the Treasury Board Submission Centre. Some analysts did mention using the Records, Documents and Information Management System (RDIMS),38 but most program analysts stated that RDIMS is difficult to use. Senior managers observed that a central repository for submission-related information would greatly facilitate their work.
The interviews and survey results also indicated that there is no formal system in use within the Secretariat for tracking conditions attached to Treasury Board submissions.
It should be noted that the Submission Tracking System (STS) maintained by the Treasury Board Submission Centre produces the agenda for Treasury Board meetings and indicates whether or not a submission has conditions attached to it. One interviewee observed that there are weekly requests to the Treasury Board Submission Centre from program analysts for background information on Treasury Board submissions, which indicates the potential value of a central repository of information on Treasury Board submissions. There is also the potential to use the system to track the implementation of conditions.
The Secretariat had developed the Expenditure Management Information System (EMIS) in part to ensure a single set of timely, complete, consistent, and accurate financial and non-financial data and a standard, automated, end-to-end expenditure management process. This was to allow for increased monitoring, tracing, and tracking of information, which would in turn support improved analysis and decision making. The expected result was to be more effective management of the government's supply and budget process.39
When the Budget Office Systems Renewal (BOSR) Project was launched (second phase of EMIS), features were developed to accommodate program sector needs with respect to Treasury Board submissions, including the tracking of conditions. These features, however, have not as yet been implemented. Program sectors maintain that these features did not meet their operational needs. Subsequent interviews suggested that this may be due in part to the fact that the Secretariat's program sectors have not fully analyzed and mapped their own business processes.
As indicated in the logic model developed by the evaluation team, the following assumption underlies the Secretariat's involvement in the submission process as it currently exists: awareness and education tools, combined with sound advice to federal organizations, should contribute to the quality of Treasury Board submissions, and that quality should steadily improve over time.
Overall changes in quality of submissions. Program analysts were asked how the overall quality of Treasury Board submissions had changed during their time with the Secretariat (Table 10). About half of the respondents noticed no significant change, a little more than one-third (37.3%) stated that the quality of submissions had improved, and about one in ten (11.9%) observed that quality had declined. However, the usefulness of this data is limited, given that most of the Secretariat analysts surveyed had been in their job for no more than two years. Also, it should be noted that the evaluation did not review Treasury Board submissions and related précis for quality.
Management Accountability Framework rating. The Management Accountability Framework (MAF) is structured around ten key elements that set out Treasury Board's expectations of senior public service managers for good public service management.40 Under one of these key elements, "Policy and Programs," the Secretariat assesses the "quality of analysis in Treasury Board submissions" according to the following criteria:
Federal organizations' MAF ratings were analyzed for MAF Rounds IV and V. MAF assessments indicate that the overall quality of analysis in Treasury Board submissions increased from Round IV to Round V. Although performance improved in relation to items 1, 2, and 4 cited above, performance with respect to the timeliness of consultations with the Secretariat actually decreased, which supports earlier findings on the adequacy of time available for input. Again, however, the usefulness of this data is limited, given that only two years of results were available at the time of data collection and that the rating criteria were amended slightly for Round V.41
Secretariat analysts' views on submission quality. For the purposes of this evaluation, Secretariat analysts identified what they believed to be the key criteria of a high-quality Treasury Board submission. They cited the following:
As seen in Table 11, a significant number of Secretariat analysts, though not an overwhelming majority, indicated that submissions did not meet these criteria. This supports earlier findings on the divergence of opinion with respect to Secretariat analysts and federal organizations having a strong understanding of the submission process (see Section 6(b)(i)).
Average Rating (%) | Program Analysts | COE Analysts | Federal Organization Respondents |
---|---|---|---|
n = sample size | n=59 | n=56 | n=85 |
Appeared to have completed internal42 consultations | 57.3 | 55.1 | 83.1 |
Were submitted with enough time for Secretariat analysts' review | 57.5 | 50.8 | 83.0 |
Followed the Guide to Preparing Treasury Board Submissions | 56.7 | 62.7 | 83.7 |
Identified risks and risk mitigation strategies | 59.5 | 52.5 | 76.4 |
Asked for the right authorities | 60.2 | 61.9 | 90.3 |
Had an appropriate level of justification | 54.7 | 55.9 | 89.8 |
Contained accurate information (e.g. accurate financial tables) | 54.2 | 54.3 | 90.8 |
Were written in a clear manner | 51.4 | 60.6 | 88.3 |
Does the Treasury Board submission process consume the minimum amount of resources required to achieve its expected outcomes?
As indicated in the Scope section, this evaluation was limited to the current submission process used to support Treasury Board ministers and did not assess this process against potential alternative mechanisms for providing this support. A limited review of international practices was conducted as part of the evaluation; however, this review was not sufficiently in-depth to suggest alternative mechanisms for the Secretariat to explore. The question of economy therefore considered only the resources required to achieve the expected outcomes within the context of the current process. A comparison of alternative mechanisms or processes and the estimated cost of those processes was not undertaken.
As noted earlier in the report, the cost of the Treasury Board submission process within the Secretariat could not be established. Nevertheless, the widespread level of effort devoted to, and interest in, submission-related activities makes it clear that the process is one of the Secretariat's key functions. The evaluation can establish through interview evidence that Secretariat senior management sees the Treasury Board submission process as a core function. In line with this finding, the level of effort devoted to the function within the program sectors and Centres of Expertise is significant. There are approximately 135 program analysts within the Secretariat acting as the single window of advice to federal organizations and supporting Treasury Board ministers in their submission-related decision making. According to their survey responses (60 responses), program analysts devote, on average, 73% of their time to Treasury Board submission-related activities. Acting as that single window, program analysts in turn consult with COE analysts to provide coordinated advice to federal organizations and in support of advice to Treasury Board ministers. Over the past five years, program analysts have consulted 181 COE analysts. According to their survey responses (66 responses), COE analysts estimate that, on average, 60% of their time is spent on Treasury Board submissions.43 A separate study examining the full cost of the submission process may well be warranted.
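Although a full costing was beyond the evaluation's scope, the figures above can be combined into a rough, illustrative estimate of the level of effort involved. The sketch below assumes that all 135 program analysts spend the reported average share of their time on submission-related work and treats the 181 COE analysts consulted over the past five years as an upper bound on the number currently involved; it is not a substitute for the separate costing study suggested above.

```python
# Rough, illustrative estimate of submission-related effort in full-time
# equivalents (FTEs), based on the staff counts and reported time shares
# cited above. The COE figure is an assumed upper bound, since the 181
# analysts were consulted over a five-year period rather than all at once.

PROGRAM_ANALYSTS = 135      # approximate number of program analysts
PROGRAM_TIME_SHARE = 0.73   # average share of time on submissions (60 survey responses)

COE_ANALYSTS = 181          # COE analysts consulted over the past five years
COE_TIME_SHARE = 0.60       # average share of time on submissions (66 survey responses)

program_fte = PROGRAM_ANALYSTS * PROGRAM_TIME_SHARE    # roughly 99 FTEs
coe_fte_upper_bound = COE_ANALYSTS * COE_TIME_SHARE    # at most roughly 109 FTEs

print(f"Program analyst effort: roughly {program_fte:.0f} FTEs")
print(f"COE analyst effort (upper bound): roughly {coe_fte_upper_bound:.0f} FTEs")
```

Even on these rough assumptions, the program analyst figure alone represents a substantial ongoing commitment of resources, which reinforces the case for a dedicated costing study.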
The proportion of more senior level analysts involved in the Treasury Board submission process has declined in recent years. In 2004, 60% of program sector ESs were ES-06s or ES-07s, compared to approximately 50% in 2008. While the proportion of ES-01s, 02s, and 03s involved in the process has remained relatively constant, the proportion of ES-04s and 05s has risen from approximately 32% to approximately 43% over the same period (as depicted in Figure 2). Note that these statistics do not take into account senior management efforts throughout the process.
Figure 2: Change in Proportion of Junior to Senior Staff
How analysts spend their time. Program analysts report spending on average 72.6% of their time working on activities related to Treasury Board submissions. As shown in Table 12,44 the reported time spent on pre-submission advice and consultation is relatively modest. This provides some corroboration of senior management's comments that inter- and intradepartmental communication is not adequate at the beginning of the submission process.
COE analysts report spending on average 60.3% of their time on activities related to Treasury Board submissions, with the majority devoted to reviewing and consulting on submissions. It should be noted, however, that program analysts are responsible for reviewing the entire submission document, while COE analysts normally review only specific sections.
A significant number of survey respondents believe that the submission process is "inefficient" (Table 13 45).
The following factors were specifically identified as having an impact on the efficiency of the process:
Further research into workload peaks would be beneficial, especially as they appear to occur on a regular, cyclical basis that aligns with the government's expenditure and reporting cycles. Workload peaks are also an issue over which the Secretariat, as the administrative arm of Treasury Board, may have some influence.
Federal organizations provide material in their Treasury Board submissions that enables the Secretariat to perform risk assessments. Risks are discussed during the development of the Treasury Board submission, but the ratings are established once the submissions are complete. Risk ratings are included in the précis, and they may have an impact on the amount of time the submission is allocated at the Treasury Board meeting.
Secretariat analysts and the staff of federal organizations commonly suggest that treating low-risk submissions differently would reduce the level of resources consumed by the submission process (Table 14). They generally argue that the risk rating should be established earlier in the submission review process and that the amount of time allocated and scrutiny given to a submission should correspond to the amount of risk the submission presents. As it is now, most submissions receive the same level of attention as they go through the submission review process, regardless of their level of risk or materiality.
Secretariat senior management expressed scepticism on this point, arguing that submissions that appear low risk at the beginning of the submission review process are sometimes perceived as high risk by the end of the process. As well, there may be risks that are of concern to Treasury Board ministers, which the Secretariat may or may not be familiar with. These may arise only when a particular submission is heard within the context of others on the Treasury Board agenda.
Though a large proportion of analysts (both at the Secretariat and in federal organizations) suggested that risk assessment be used to reduce the use of resources (see Table 14), Secretariat senior management clarified that risk assessment is intended to allow Treasury Board ministers to spend more time on proposals of higher risk and not to reduce workload. The results point to an opportunity to further explore how risk assessment can be used to enhance the efficiency of the process.
Other proposed measures for improving the efficiency of the current submission process that emerged during the evaluation's working sessions and interviews include the following:
In addition, evaluation participants proposed increasing the level of authority delegated to federal organizations as a measure for improving efficiency. Since this suggestion would require more fundamental changes to the mechanism used to support Treasury Board ministers, it is beyond the scope of the evaluation. However, it should be noted that the Secretariat is currently conducting a separate "earned and risk-based delegation" project that is examining various approaches and options for potentially increasing the level of authority delegated to federal organizations.