Our strategic outcome is to contribute to well-managed and accountable government by conducting independent audits and studies that provide objective information, advice, and assurance to Parliament, government, and Canadians.
We measure and monitor our performance against our results chain (see Section IV—Supplementary Information). It links what we deliver—audits, studies, opinions, information, and advice—to our strategic outcome (long-term result).
The Office has established a set of core indicators of impact and measures of organizational performance to help inform management decision making.
Our indicators of impact help us to assess the extent to which our work adds value for the key users of our reports, our work adds value for the organizations we audit, key users of our reports are engaged in the audit process, and key users and the organizations we audit respond to our findings.
Our measures of organizational performance help us monitor the extent to which our audit work is delivered on time and on budget and complies with our quality management frameworks and professional standards.
In addition to measuring the Office's ongoing performance, we identified four priority areas for 2007–08. This year we received an increase of approximately $4 million in ongoing funding and an additional $2 million in one-time funding to address these priorities and other Office needs. The priorities are the following:
Approximately half of the $2 million in one-time funding received this year was spent on replacing our financial system. We successfully launched a new system on 1 April 2008 to coincide with the beginning of our fiscal year. We completed this project under budget by approximately 8 percent. The other half of the funding was earmarked to update our electronic data and records management system, which is a multi-year project.
We also have two significant long-term commitments:
Exhibits 4 and 5 provide a summary of the Office's most recent results.
| Objectives and indicators | 2006–07 Actual | 2007–08 Actual | 2007–08 Target |
|---|---|---|---|
| **Our work adds value for the key users of our reports** | | | |
| Percentage of parliamentary committee members who find our performance audits add value | 92 | 97 | 75 |
| Percentage of audit committee chairs who find our financial audits add value | 75 | n/a¹ | 75² |
| Percentage of board chairs who find our special examinations add value | 87 | 75 | 75 |
| **Our work adds value for the organizations we audit** | | | |
| Percentage of departmental senior managers who find our performance audits add value | 61 | 55 | 65 |
| Percentage of Crown corporation and large department senior managers who find our financial audits add value | 66 | n/a¹ | 75² |
| Percentage of Crown corporation chief executive officers who find our special examinations add value | 78 | 96 | 75 |
| **Key users of our reports are engaged in the audit process** | | | |
| Number of parliamentary hearings and briefings we participate in | 66 | 33 | No target established |
| Percentage of performance audits reviewed by parliamentary committees | 63 | 56 | No target established |
| **Key users of our reports and the organizations we audit respond to our findings** | | | |
| Percentage of performance audit recommendations fully implemented four years after their publication | 46 | 55³ | 50 |
| Percentage of reservations that are addressed from one financial audit to the next | 100 | 0⁴ | 100 |
| Percentage of significant deficiencies that are addressed from one special examination to the next | 100 | 50⁵ | 100 |
We gather information on the impact of our work by measuring a number of indicators that are external to the Office, and are therefore not entirely under our control.
To assess if our work adds value for the key users of our reports and the organizations we audit (the first two indicators of impact), we survey representatives of these two groups. Survey respondents are asked to rate, on a five-point scale ranging from "very poor" to "very good" or from "almost never" to "almost always," many aspects of our audits and our interactions with these users. We began to report the survey results under this methodology of performance assessment in our 2003–04 Performance Report.
While the response rates to our surveys are generally good, the actual number of respondents is quite small. Year-over-year variances in results should therefore be interpreted with caution: given the small number of respondents, even a few changed responses can appear as a relatively significant change in the overall rating. (For details on the methodology used, see methodological endnotes 1 and 2 under Section IV—Supplementary Information.)
This year we undertook to review and renew our post-audit surveys for all audit products. The last time we conducted such a review was in 2003. Our objective was to ensure that the surveys were well focused on the most important aspects of how audits can add value and on the key elements of managing an audit professionally and efficiently. Results this year for performance audits and special examinations were obtained using our previous survey design. However, no post-audit surveys were conducted for our financial audits as a result of undertaking this review.
For this indicator, we survey the key users of our reports:
The results of our surveys for the specific items that we use to define the term "add value" are shown in Exhibits 6, 7, and 8.
Survey results for performance audits. In July 2008, we completed our second annual survey of parliamentarians. It related to our performance audits tabled in 2007–08.
We surveyed the members of the four key parliamentary committees that review our reports: the House of Commons Standing Committee on Public Accounts and the Standing Committee on Environment and Sustainable Development, as well as the Senate Standing Committee on National Finance and the Standing Committee on Energy, Environment and Natural Resources. Of the 46 members asked to respond to our survey, 24 responded (Exhibit 6).
Exhibit 6—Performance audits add value for parliamentary committee members
Our target is to have 75 percent of respondents provide us with ratings of "often" or "almost always." Virtually all parliamentarians who responded provided such ratings for our 2007–08 audits.
Survey results for financial audits. To determine the value of our financial audits, we have conducted two biennial surveys of the chairs of audit committees and other bodies with financial reporting oversight responsibility (Exhibit 7).
Exhibit 7—Financial audits add value for audit committee chairs
As noted earlier, we did not survey our financial audits in 2007–08. We will begin surveying our financial audits annually in 2008 using our new survey and will be reporting the results starting in our 2008–09 fiscal year. We revised our survey in part because we felt that some of the questions should be clearer and more focused. We have made changes to the wording and will be monitoring future results closely. The information presented in Exhibits 7 and 10 pertains to the fiscal years 2002–03 and 2004–05.
Survey results for users of special examinations. To determine the value of our special examinations to Crown corporations, we survey their board chairs (Exhibit 8). The number of special examinations we complete annually is small (eight in 2007–08 and six in 2006–07) as is the number of survey respondents (four in each of these years).
In our 2007–08 Report on Plans and Priorities we established a target of 75 for the percentage of respondents who would rate our performance as good or very good. This year, responses to three of the four questions are on or above our target. All four respondents rated us as good or very good at preparing reports that are clear and concise. Three rated us as good or very good at identifying good opportunities for improvement and at reporting our findings in an objective and fair manner. Only two rated us as good or very good at focusing attention on the most important issues within the scope of the examination.
The Office regularly surveys representatives of the organizations we audit to determine their assessment of the value of our work. We have identified three key representatives of the organizations we audit:
The items used to define the term "add value" are the same as those included in the surveys of report users. The results for the surveys of the organizations we audit are shown in Exhibits 9, 10, and 11.
Performance audit results. Since 2003–04, we have surveyed organizations subject to our performance audits after tabling the applicable report in Parliament. This year we received 52 completed surveys.
The target for performance audits, established in our 2007–08 Report on Plans and Priorities, was to have 65 percent of respondents rate our performance as good or very good in adding value for senior management. The most recent survey responses produced an average score of 55 percent, and ratings were below the target for all of the items used to define adding value.
Over the past five years, the assessments of department senior management have been lower than we would like to see (Exhibit 9). In the coming year we will be taking steps to understand the reasons for those assessments and developing any necessary responses.
Exhibit 9—Performance audits add value for senior management
Financial audit results. We have conducted two biennial surveys of the senior managers of Crown corporations and senior managers of large departments subject to a financial audit (Exhibit 10).
As noted earlier, we did not conduct a survey regarding our financial audits in 2007–08 as a result of our review and renewal of the surveys. The information presented in Exhibit 10 pertains to the fiscal years 2002–03 and 2004–05 only.
Special examination results. In our 2007–08 Report on Plans and Priorities, we established a target of 75 for the percentage of chief executive officers of Crown corporations that rated our performance as good or very good. The results have to be interpreted with caution, however: for the eight special examinations conducted in 2007–08, six survey responses were received, compared with two responses for the six special examinations in the previous year.
Responses to all four questions are on or above this 75 percent target (Exhibit 11). This year all respondents rated us as good or very good at focusing the examination on the most important issues, at identifying good opportunities for improvement, and at preparing reports that were clear and concise. Five respondents rated us as good or very good in reporting our findings in an objective and fair manner.
Exhibit 11—Special examinations add value for chief executive officers
For this indicator, we once again focus on the key users of our reports:
Involvement with parliamentary committees. While many parliamentary committees draw on our work, the Office's main relationship is with the Standing Committee on Public Accounts. Our appearances before committees assist parliamentarians in fulfilling their oversight role and give us the opportunity to increase awareness and understanding of the issues in our reports.
For performance audits, we monitor the level of involvement of parliamentary committees by tracking the number of audits reviewed by committees. We also assess the committees' level of interest in our reported findings by looking at how frequently they ask us to appear before them to further elaborate on our findings. It is important that the key users of our reports be engaged in the audit process, understand the nature and objectives of our work, and understand our reports and follow up on issues presented in them.
Parliamentary committee hearings also encourage departments and agencies to implement our recommendations. Following a hearing, the committee may report and make recommendations to the government. Departments and agencies are expected to report back to the committees on what they have done in response to these recommendations.
In 2007–08, we participated in 33 hearings and briefings (Exhibit 12): 14 with the Public Accounts Committee and 19 with other committees. This number is lower than last year's figure, which had been a record number of hearings and briefings for this Office, but is consistent with the average from previous years. One of our reports was the subject of many hearings, most of which we were not required to attend. Accordingly, we have not included them in this calculation.
Exhibit 12—We participate in parliamentary hearings and briefings
To determine coverage, we measure the percentage of our total audits in a year that are reviewed by a committee. Parliamentary committees reviewed 56 percent of our 2007–08 performance audits. This compares with 66 percent in 2006–07 and with 52 percent in 2005–06. (For further details, see methodological endnote 3 under Section IV.)
Committee hearings covered a wide range of topics and audit reports; for example, the NORAD system, the forensic laboratories of the RCMP, the Coast Guard fleet and marine navigational services, the Social Insurance Number, and international tax administration by the Canada Revenue Agency.
The Commissioner of the Environment and Sustainable Development usually appears before both the House of Commons Standing Committee on Environment and Sustainable Development and the Senate Standing Committee on Energy, the Environment and Natural Resources. Other committees will also call upon the Commissioner if they are studying matters audited by the Commissioner. This year, the interim Commissioner participated in a hearing on Bill C-474, an Act to require the development and implementation of a National Sustainable Development Strategy. Although it is not our common practice to participate in discussions surrounding new bills, we agreed to testify since the Act touched upon our Office's mandate. The interim Commissioner also participated in a hearing on adapting to the impacts of climate change and in other hearings where he discussed the findings of his 2007 and 2008 reports to Parliament.
Involvement with Crown corporation boards and other bodies. Throughout the financial audit process, we work closely with boards and audit committees that have oversight responsibility for financial reporting. We engage these committees in our audit work to help them fulfil their oversight responsibilities.
We brief them regularly on the progress of our work. The committees will normally review the audit plan, including the audit scope, strategy, and procedures. Discussions include how the plan addresses the corporation's significant risks, as well as other matters of interest that may have an impact on our work. In finalizing our audit report, we meet with the committees to discuss any significant findings and recommendations together with management's response and follow-up action.
We believe that the quality of our audit products greatly benefits from this open communication and active participation of audit committees and other bodies having oversight responsibility for financial reporting.
Involvement of boards of directors for special examinations. As with financial audits, we work closely with the boards of directors of Crown corporations and with their associated committees having oversight responsibility. We seek input from these committees in preparing our audit plans and solicit feedback from them as part of our post-examination process. We use the results of this feedback to assess our effectiveness and improve our practices.
For this indicator, the Office monitors the extent to which
The Office has limited control over the extent to which the above-noted items occur. Nonetheless, we track this information to the extent feasible and use it as input to certain internal management processes, such as the planning process.
Parliament considers the issues raised in our reports. We monitor how our performance audits help Parliament hold the government to account by identifying examples of how Parliament considers issues of accountability, performance, compliance with authorities, and the environment and sustainable development in its legislative and oversight work.
The following examples illustrate how our 2007–08 work has contributed to the legislative and oversight work of Parliament.
Hearings were held on our special examinations (Canadian Broadcasting Corporation [CBC/Radio-Canada], November 2005 report, and Atomic Energy of Canada Limited [AECL], September 2007 report)

Background. Crown corporations form a significant part of the federal public sector. Federal Crown corporations employ about 90,000 people, manage more than $185 billion in assets, and have long-term liabilities of about $145 billion. These distinct legal entities, wholly owned by the government, are used to deliver important public programs. In special examinations, the Auditor General provides an opinion to the board of directors on the management of the Crown corporation as a whole. Federal Crown corporations are subject to a special examination at least once every five years.

In the March 2004 Budget, the Government of Canada announced that it intended to introduce new corporate governance rules that would require Crown corporations to post special examination reports from the Auditor General on their websites. Since then, all 29 special examination reports the Office has issued to Crown corporations (that are still active) have been made public by the individual corporations. Unlike our performance audits of federal departments and agencies, special examinations have rarely been the subject of parliamentary hearings. The public release of many special examinations provides an opportunity for committees to hold Crown corporations accountable.

Results. In the fiscal year 2007–08, two previously conducted special examinations were the focus of parliamentary hearings. In May 2007, the Standing Committee on Canadian Heritage held a hearing on the 2005 CBC/Radio-Canada special examination. The Committee was conducting an extensive study on the role of a public broadcaster in the 21st century. The Committee issued a report in February 2008 in which it recommended that CBC/Radio-Canada report to the Committee, at its earliest convenience, on its progress on the recommendations made in the 2005 special examination carried out by the Office of the Auditor General. The Committee requested a response from the Corporation.

During its January–February 2008 study of nuclear safety issues, the Standing Committee on Natural Resources asked us to appear for a hearing on our 2007 AECL special examination. In its testimony, the Office was able to provide details on unresolved strategic challenges faced by AECL, such as the completion and licensing of its medical isotope facilities, the development of a new generation of CANDU reactors, and the replacement of aging facilities at the Corporation's Chalk River Laboratories.
Committees reviewed our chapter on Military Health Care—National Defence (October 2007 Report, Chapter 4)

Background. During our audit, we found a number of weaknesses in the management of the military health care system. For example, National Defence (DND) lacked the information to know whether levels of service at its clinics were appropriate to medical and operational needs and whether the costs of providing them were reasonable, even though those costs were rising. The audit also found that while the Department had developed a mental health care model based on best practices, the system was short of resources to meet the demand for mental health services.

Results. This chapter was reviewed by two parliamentary committees. The Standing Committee on National Defence conducted an extensive study on health services provided to Canadian Forces personnel, for which several witnesses appeared. The Auditor General and National Defence officials were called to appear before the Committee in March 2008 regarding our chapter. Given the Committee's specific interests, the Office provided further details on our findings relating to mental health care. DND agreed with our recommendations and developed an action plan to address the concerns raised in our chapter. The Auditor General suggested that progress reports from DND might be helpful to the Committee.

The Standing Committee on Public Accounts also held a hearing on this chapter in January 2008, at which our Office and National Defence appeared. We provided details on our findings relating to the lack of health care information to monitor and measure performance, the need to better demonstrate the link between service delivery and the rising cost of military health care, and the need for better governance and oversight. The Department outlined the activities it is undertaking to improve health care for military men and women.

Following the hearing, the Committee issued a report in which it recommended that, in order to hold the Department to account for fulfilling its commitments, it provide the Committee with a detailed progress report on the implementation of its plan to address the deficiencies identified in our chapter.
Organizations implement our performance audit recommendations. Departments and agencies are responsible for taking corrective action and improving their management practices. We have established that four years is a reasonable period of time to fully implement our recommendations. Annually, we request an update from these organizations on their progress in implementing our recommendations. This year we also asked them to assess their level of implementation. The information we receive is self-reported by the departments and agencies. While we do not subject it to any detailed review or audit, we do consider it for consistency with our current knowledge of the organization.
In 2003–04, we issued 230 recommendations. Since then, 34 recommendations have become obsolete, so we requested a total of 196 status assessments. We have received responses for 188, or 96 percent, of these requests for status assessments. Departments reported that they believed they have fully implemented 55 percent of the performance audit recommendations we tabled four years ago and have substantially implemented 29 percent (see methodological endnote 4 under Section IV—Supplementary Information).
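The figures above follow directly from the reported counts. As an illustrative check only (the variable names are ours, not the Office's):

```python
# Recommendations issued in 2003-04 and their status four years later,
# using the counts reported in the text above.
issued = 230
obsolete = 34
assessments_requested = issued - obsolete   # 196 status assessments requested
responses_received = 188

response_rate = responses_received / assessments_requested
print(f"Status assessments requested: {assessments_requested}")
print(f"Response rate: {response_rate:.0%}")
```

Running this reproduces the 196 assessments requested and the 96 percent response rate cited above.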
These numbers represent a new baseline for this indicator due to a change in how the number is determined: departments now self-assess their progress. We believe this new approach is better aligned with departmental responsibilities to monitor and report on their responses to our recommendations as described in the Treasury Board Directive on Departmental Audit Committees. This directive states, in section 4.2.6.2, that the chief audit executive shall report periodically to the audit committee on whether management's action plans to address audit recommendations have been implemented and whether the actions taken have been effective.
Each year, we prepare a status report, which follows up on progress made by the government in responding to recommendations contained in previous performance audits. Status reports focus attention on significant recommendations and findings, thereby providing information to Parliament as it holds departments and agencies to account for actions taken, not taken, and planned.
Organizations address opinion reservations and significant deficiencies. For our financial audits and special examinations, we monitor the corrective action taken in response to opinion reservations and significant deficiencies contained in our reports. Our indicator is the percentage of reservations or significant deficiencies that are addressed from one report to the next. Our target is 100 percent.
This year's result for financial audit reservations is 0 percent. For our financial audits of federal organizations in 2006–07 and 2007–08, no reservations were issued. However, we issued two audit reports of territorial entities this year with reservations, and in both cases, there had also been a reservation in the preceding year's audit. We issued denials of opinion for all four audits. A denial of opinion is an expression by the auditor that no opinion can be provided because of significant limitations on the audit. In the case of one of the entities, the Northwest Territories Business Development and Investment Corporation, we issued our reports in 2007–08 on its last two fiscal years.
This year's result for addressing special examination significant deficiencies is 50 percent. For the eight special examinations reported this year, we had identified four significant deficiencies in the previous examinations, of which two were addressed.
A significant deficiency was reported in 2007 for Atomic Energy of Canada Limited (AECL) that was made up of three key areas. Two of these areas had been previously identified as significant deficiencies in our last special examination in 2002. While AECL had made progress on other significant deficiencies reported, these two continued to be strategic challenges that need to be resolved. They related to the completion and licensing of the Dedicated Isotope Facilities and securing long-term funding for the replacement of aging facilities at its Chalk River Laboratories.
We monitor department sustainable development strategies. In 1995, section 23(2)(a) was added to the Auditor General Act, directing us to monitor and report on the extent to which departments have met the objectives and implemented the plans set out in their sustainable development strategies (SDSs).
Over the years, we have monitored a number of departmental SDSs annually and continue to do so. In 2007, we also conducted an in-depth audit of selected strategies to determine whether they were encouraging departments to integrate the environment with economic and social considerations when developing policies and programs for the future and when managing programs and activities of the day. We found little evidence that SDSs were fulfilling this role, and we called on the government to conduct a thorough review of why the strategies are not working and what needs to be done to get them back on track. The government accepted our recommendation and the review is under way.
In March 2008, we reported an audit of the government's strategic environmental assessment (SEA) process, and found that it was not working either. The SEA process is designed to ensure that environmental effects are assessed and considered by ministers when new policies and programs are developed and approved. The Canadian Environmental Assessment Agency is leading a review to determine why SEAs are not working and what needs to be done to fix them.
SDSs and SEAs are two fundamental tools that the government has created to manage environmental and sustainable development issues. Both would benefit from an overarching government-wide sustainable development strategy or plan that would provide context and a sense of direction and purpose for departmental activities and programs.
We are hopeful that the reviews that are now underway of the SDS and SEA processes, and consideration by the government of an overarching sustainable development plan, will lead to significant strengthening of these important tools. This, in turn, will make the work that we are required to do under section 23(2)(a) of the Auditor General Act more relevant to both the government and Parliament than it is now.
We monitor environmental petitions. The 1995 amendments to the Auditor General Act require that we monitor and report annually to Parliament on environmental petitions received from Canadians. The petitions process allows Canadians to voice their concerns about environmental matters and to address questions to federal ministers and obtain responses. Twenty-eight federal departments are required by the Auditor General Act to respond to petitions.
In 2007–08, the Office received 52 environmental petitions. Ministers of federal departments are required to respond to petitioners within 120 days. They responded on time to 84 percent of the petition responses due in 2007–08 (Exhibit 13). While ministers are responsible for responding to petitions on time, we note that the decrease in on-time responses to petitions may have been influenced by a number of factors:
Canadians have been submitting petitions and ministers have been responding to them for 12 years. This year we looked at past experience to develop future options for strengthening the petitions process. We surveyed petitioners and federal departments, and we interviewed officials of the departments most often petitioned and of other organizations with similar citizen engagement processes. The results of this retrospective were reported to Parliament in our October 2007 Report.
In addition, we continued our practice of auditing selected issues and commitments made by ministers in their responses to petitions. In 2007–08, we audited departmental progress in responding to recommendations made in four previous petitions response audits. The results of this work were reported to Parliament in our Status Report in March 2008.
Through selected measures designed to evaluate organizational performance, we gather information on how efficiently and effectively the Office itself is functioning (Exhibits 14 and 15). The measures involve items for which the outcome is largely under the control of the Office.
On time. For performance audits, the Office determines when individual audit reports will be tabled in the House of Commons; thus, there are no statutory deadlines for these reports. However, we do communicate to the Public Accounts Committee our planned tabling schedule for performance audits for the coming fiscal year. In our 2007–08 Report on Plans and Priorities, 31 performance audits were listed as planned for tabling during the current fiscal year. For federal performance audits, 27 were tabled as planned and one was cancelled. For territorial performance audits, one of the three was tabled as planned, one was late, and one was tabled three months later than planned at the request of the territorial government. Details of the audits tabled are in Section IV—Supplementary Information.
All federal Crown corporation financial audits were completed on time, meeting our target of 100 percent, and 94 percent of audits of other federal organizations with a statutory deadline were completed on time (Exhibit 14). Completing audits of other federal organizations without a statutory deadline on time can be more challenging as these entities are not always ready to be audited within our self-determined deadline of 150 days after the period end. Nonetheless, in 2007–08, 81 percent of these audits were completed on time, less than our 2007–08 target of 100 percent. We have since revised our target to 70 percent for 2008–09.
Territorial financial audits present some unique challenges, including client readiness and a number of specific accounting and auditing issues. In 2007–08, 59 percent of these audits were completed on time, a slight improvement over 2006–07, though well below our target of 100 percent. We have revised our target to 55 percent for 2008–09 in recognition of territorial circumstances.
Seventy-five percent of our special examinations were completed on time in 2007–08, a significant improvement from 25 percent the previous year. Because of the previous year's results, the Office decided to plan to transmit completed reports six months before the statutory date. In addition, it developed a set of key principles to be applied in planning special examinations.
On budget. For all of our audits, being on budget is defined as completing the audit within 115 percent of the budgeted hours for the audit. This figure recognizes that factors outside the control of the audit team, such as client readiness and the number and complexity of audit issues identified, can affect time spent on an audit.
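The 115 percent threshold described above amounts to a simple comparison of actual against budgeted hours. A minimal sketch, with illustrative names and figures of our own rather than anything drawn from the Office's systems:

```python
def on_budget(actual_hours: float, budgeted_hours: float) -> bool:
    """An audit is 'on budget' when the hours actually spent do not
    exceed 115 percent of the hours budgeted for the audit."""
    return actual_hours <= budgeted_hours * 1.15

# Hypothetical audit budgeted at 1,000 hours:
print(on_budget(1100, 1000))  # True: 1,100 hours is within the 115% allowance
print(on_budget(1200, 1000))  # False: 1,200 hours exceeds the allowance
```

The 15 percent allowance reflects factors outside the audit team's control, such as client readiness and the complexity of issues identified.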
All of our on-budget results are well below our target of 70 percent (Exhibit 15).
While there are many possible explanations for why individual audits exceeded their budgets, overall the Office believes that these results mean we can do a better job of planning, monitoring, and developing budgets for our audits. Our employee survey results also tell us that our staff believe we can do a better job of managing our allocation of staff to products.
Consequently, we have identified improving our overall project management, including better planning, budgeting, and resource allocation, as a strategic objective for the coming year. Some actions are already in place and others are being developed.
Our audit work is guided by a rigorous methodology and quality management frameworks. External and internal reviews, based on our frameworks, provide the Auditor General with opinions as to whether our audits are conducted in accordance with established standards of professional practice, and whether our quality control system is appropriately designed and effectively implemented and applied.
External reviews. In 1999, we hired an audit firm to assess our quality management system for annual financial audits. In 2003, an international team of legislative auditors carried out a peer review of the Office's quality management framework (QMF) for performance auditing. Both reviews found that our frameworks were suitably designed and operating effectively. The review of our QMF for performance auditing highlighted some good practices and made suggestions for improvement. Our action plan to address these suggestions has been completed and is available on our website under About Us. We have started planning for the next review of our quality management frameworks for all of our audit product lines and related services, and we intend to have the review carried out in 2009–10.
In addition, the provincial institutes of chartered accountants review our compliance with professional standards for financial audits about every four years and determine whether our training of chartered accounting students meets their requirements. The recent reviews concluded that we were following professional standards and met their requirements.
Internal practice reviews. We conduct practice reviews of our financial audits, special examinations, performance audits, and assessments of agency performance reports to assess their quality and compliance with our quality management frameworks. The frameworks are based on Office policies and professional standards. The reviews assure the Auditor General that our audits are of high quality and are being conducted according to our quality management frameworks. They also provide managers with suggestions for improvement.
In 2007–08, we completed two internal practice reviews of performance audits. The reviews concluded that the audits were conducted according to our quality management framework. Suggestions for improvement focused on documentation and the quality reviewer function.
We were not able to meet our 2007–08 objective of performing about 10 practice reviews because staff were reassigned to other Office priorities, some positions were vacant, and our Chief Audit Executive retired in early 2008. Though we were unable to complete the planned reviews, the Office still complies with the CICA (Canadian Institute of Chartered Accountants) standard requiring a practice review of each of our practitioners at least once every four years.
As they are completed, the results of our practice reviews are published on our website under About Us.
Internal audits. We also audit our management and administration practices. These audits assure the Auditor General that the Office is complying with government and Office policies. They also provide managers with assessments and recommendations.
In 2007–08, we conducted one internal audit on staffing and followed up on previous audit work. We found that the Office has appropriate staffing processes in place to ensure compliance with the Public Service Employment Act, other applicable legislation, and Office policies. We did find cases, however, where staffing activities should be better documented to demonstrate compliance with the Act. We also identified several opportunities for improvement in the internal and external staffing processes, which we discussed with the Assistant Auditor General of Corporate Services and with staff of the Human Resources Group; they agreed with our recommendations.
As they are completed, the results of our internal audits are published on our website under About Us.
The Office has set four objectives for providing a respectful workplace, each with its own indicators and targets:
Satisfied and engaged employees. Our target for this objective is to maintain a minimum level of 70 percent of employees being satisfied with their workplace. Our 2008 employee survey had a 93 percent response rate, which compares with the 90 percent rate in 2006 and is well above the 69 percent rate in 2004 and the 65 percent norm for most organizations. The results show that 88 percent of employees believe the Office is above average or one of the best places to work. This compares with overall employee satisfaction rates of 70 percent in 2004 and 82 percent in 2006 and with a 64 percent norm for most organizations (see methodological endnote 5 under Section IV—Supplementary Information).
Our challenge during the next year will be to maintain the high level of satisfaction and continue to look for ways to improve. In response to the 2006 employee survey, the Executive Committee committed to taking action in the following six areas: supervisory effectiveness, training, promotion process, career development, staffing, and effective communications. Implementation of the Corporate Action Plan began in December 2006. Our goal was to ensure that all the initiatives identified were well under way or completed before our 2008 employee survey. This goal was achieved and a final report presented to the Office's Executive Committee in the spring of 2008.
A bilingual workforce. The Office has improved its bilingual capacity in the management group, particularly for directors, with an increase from 59 percent in 2006–07 to 75 percent (our target) in 2007–08. (See methodological endnote 6 under Section IV—Supplementary Information.)
A representative workforce. The Office maintained its workforce at approximately the same level as the previous year, yet improved its relative levels of representation for all four designated groups. Three of the four designated groups were represented at levels greater than 100 percent of their workforce availability. For visible minorities, we increased our level of representation to 83 percent of workforce availability.
Retention rate. Our retention rate of 86 percent for audit professionals has held steady over the past year but remains below our target of 90 percent. To raise it, we have developed a retention and recruitment strategy and are focusing greater attention on specific target groups, especially in the accounting field.
In addition to the positive performance for most of our respectful workplace indicators, we were pleased to be selected as one of Canada's top 100 employers and one of Canada's top 10 family-friendly employers for 2008—valuable recognitions that will enhance our recruitment efforts.
Sustainable development is the integration of environmental, economic, and social considerations in the development and implementation of government programs. Our 2007–2009 Sustainable Development Strategy was tabled in Parliament in December 2006 and is available on our website. It presents our plans to further integrate environmental considerations into our audit selection and planning decisions and our operational decision making. The targets we set and our progress to date are summarized in Exhibit 16.
The Office of the Auditor General has more than 50 years of experience in working with the international community in developing international accounting and auditing standards, building capabilities and professional capacities of national audit offices, sharing knowledge, and conducting audits of international organizations. These activities have helped improve the Office's own legislative audit practice, fostered the transfer of knowledge and skills between audit offices, and strengthened organizations in the United Nations system. Our international strategy guides our international activities while positioning the Office to meet future opportunities and challenges.
International accounting and auditing standards are influencing Canada's public and private sector standards and will soon be mandatory in Canada for private sector enterprises. The setting of accounting and auditing standards is shifting from the domestic to the international arena. The Office plays an active role in shaping these standards, particularly as they relate to the public sector.
The Office is a member of the International Organization of Supreme Audit Institutions (INTOSAI) and is a member of several of its committees, including the Professional Standards Committee. The Auditor General chaired its Subcommittee on Supreme Audit Institution Independence. In November 2007, the Code of Independence that it helped develop was approved by the International Congress of Supreme Audit Institutions as part of the International Standards of Supreme Audit Institutions. The subcommittee was dissolved after completing its work.
The Auditor General assumed the chair of the Professional Standards Subcommittee on Accounting and Reporting Standards in November 2007. The Office is also a member of the Financial Audit Guidelines Subcommittee, actively contributing to the development of high-quality, globally accepted guidelines for auditing financial statements in the public sector.
In January 2008, the Auditor General became a member of the International Public Sector Accounting Standards Board of the International Federation of Accountants.
In addition, employees in the Office participate in various task forces of the International Auditing and Assurance Standards Board to revise and develop International Standards on Auditing. This expert participation helps to build public sector considerations into these international standards.
The Auditor General chaired the INTOSAI Working Group on Environmental Auditing (WGEA) until November 2007 and continues to support WGEA activities by providing assistance to the Auditor General of Estonia, who now chairs the WGEA. The working group helps supreme audit institutions better understand environmental issues and build their capacity to audit their governments' environmental protection and sustainable development activities, by preparing guidance materials, training auditors, and facilitating knowledge sharing among members.
The International Legislative Audit Office Assistance Program for Improved Governance and Accountability of the CCAF-FCVI Inc. was established in 1980 to strengthen performance auditing in national audit offices. Funded by the Canadian International Development Agency, the program brings auditors from national audit offices to Canada for 10 months of training in performance auditing, accountability, and governance. Training is provided by our Office and that of the Vérificateur général du Québec. Since 1980, the program has trained more than 186 fellows from 51 developing countries.
The Office has recently completed its audit mandate of the United Nations Educational, Scientific and Cultural Organization (UNESCO) and the International Civil Aviation Organization. In early 2007, the Office was selected as the external auditor of the International Labour Organization effective in 2008.