THE CLASS GRANTS AND CONTRIBUTIONS PROGRAM

EVALUATION REPORT

FINAL REPORT
MAY 2013

EVALUATION DIRECTORATE

Table of Contents


Acronyms


List of acronyms
CGC Class Grants and Contributions
CFO Chief Financial Officer
DFO Fisheries and Oceans Canada
EOSS Ecosystems and Oceans Science Sector
EC Environment Canada
FMAS Financial Management Advisory Services
G&C Grants and Contributions
NRCan Natural Resources Canada
O&M Operating and Maintenance
PAA Program Alignment Architecture
TB Treasury Board

Executive Summary


Introduction

This report presents the results of the evaluation of the “Class” Grants and Contributions Program (the Program). In accordance with the Treasury Board Policy on Evaluation (2009), the evaluation focused on the extent to which the Program demonstrates value for money by assessing the core issues of relevance and performance, including effectiveness, efficiency and economy.

This evaluation covered the period of 2009-2010 to 2012-2013 and was undertaken between January and May 2013 by Fisheries and Oceans Canada’s (DFO) Evaluation Directorate.

Program Profile

The “Class” Grants and Contributions Program is a transfer payment program.  It is intended to improve the capacity of DFO programs and recipients to share and obtain knowledge and improve capacity for understanding of fisheries and oceans issues. It was established in 1986 in order to reduce the number of individual low dollar, low risk Treasury Board Submissions.  DFO programs submit their applications to the Financial Management Advisory Services Branch.  Applications are assessed for compliance against the Program`s Terms and Conditions.  Following Deputy Minister or Ministerial approval, an agreement is entered into for either a grant or a contribution.   The Program received the following funding from 2009-2010 to 2012-2013:

Funding by Year
Fiscal Year   Grants     Contributions   Total
2009/10       $152,500   $234,000        $386,500
2010/11       $152,500   $234,800        $387,300
2011/12       $152,500   $234,800        $387,300
2012/13       $152,500   $234,800        $387,300
Total         $610,000   $938,400        $1,548,400

A Sector, Region and/or Agency may also contribute additional funding from their own reference levels. Each successful request requires that the sponsoring responsibility centre manager provide Vote 1 Operating and Maintenance (O&M) funds in order to obtain Vote 10 Grants and Contributions funds.

Evaluation Methodology

The evaluation employed a program theory-driven evaluation science approach. A non-experimental design was used for this evaluation. The evaluation was calibrated to ensure an optimal level of effort to yield valid findings and conclusions. A review of documents and program files was conducted, as well as interviews with the Program’s personnel and its clients. There are only minor limitations, none of which impacts the interpretation of the findings and conclusions.

Evaluation Findings

Relevance

This evaluation concluded that there is a need for the Program. It is a cost-effective alternative to submitting individual Treasury Board Submissions. In total, 78 projects were funded, for a total of $2.4M, over the last four years. Demand for the Program has declined in recent years, mainly due to the recent economic downturn and resulting period of fiscal restraint. The Program is a legislated government responsibility. Projects align with the mandate and requirements of the Minister as laid out in the Department’s key pieces of legislation. The Department is well positioned to deliver the Program given its knowledge of key stakeholders in the oceans and fisheries sectors and its capacity to bring these stakeholders together. Finally, the Program aligns with the Government of Canada’s priority of reducing workloads and encouraging administrative efficiencies. All projects aligned with Fisheries and Oceans Canada’s strategic outcomes.

Performance (Effectiveness, Efficiency and Economy)

Overall, the Program is achieving its intended outcomes. All projects focused on a broad array of knowledge that was relevant to the Department. Projects addressed one or more fisheries and/or oceans related issues. Projects built worldwide capacity by providing access to domestic and foreign organizations and individuals from third world countries that would otherwise have been unable to participate. In some cases, projects contributed towards maintaining important infrastructure, ensuring a continued flow of information from worldwide member states. Individual projects can have significant reach. Projects funded publications with thousands of subscribers as well as well-attended conferences and workshops. In short, the Program enables DFO programs to achieve their outcomes and, given their limited resources, is seen as providing considerable support in doing so.
 
Overall, the Program is efficiently producing outputs and operating in a way that minimizes the use of resources. The Program is cost-effective. Eliminating the need to develop individual TB Submissions for each grant and contribution saved thousands of person hours over the last four years. The Program is appropriately located within the Department’s organizational structure.

Conclusion

The Program is relevant and performing as intended. No recommendations are required at this time to improve its performance.

1. Introduction


Context of the Evaluation
This evaluation report presents the results of the evaluation of the “Class” Grants and Contributions Program (the “Program”). This evaluation is included in the Department’s 2013-2014 multi-year evaluation plan. The Program was previously evaluated in 2008-09. Section 42.1 of the Financial Administration Act requires all ongoing programs of grants and contributions to be evaluated once every five years. Treasury Board’s Policy on Evaluation (2009) requires all direct program spending to be evaluated every five years.

Scope

In accordance with the Treasury Board (TB) Policy on Evaluation (2009), the evaluation focused on the extent to which the Program demonstrates relevance and performance, including effectiveness, efficiency and economy.  This evaluation covered the period of 2009-2010 to 2012-2013 and was undertaken between January and May 2013 by staff from Fisheries and Oceans Canada’s (DFO) Evaluation Directorate.

2. Program Profile


2.1 Background

The “Class” Grants and Contributions (CGC) Program is a transfer payment program.  The Program has been in existence since 1986.  Prior to 1986, DFO regularly sought, through individual submissions, Treasury Board approval for a wide variety of relatively low dollar value and low risk grants and contributions.  The Program was established to consolidate these individual proposals into two concise programs. 

The Program is primarily associated with the Program Activity “Integrated Fisheries Resource Management”, which contributes directly to the departmental strategic outcome of Economically Prosperous Maritime Sectors and Fisheries in the Department of Fisheries and Oceans’ 2012-13 Program Alignment Architecture (PAA). The Program may also support other programs in DFO’s PAA where the applicant program’s objectives clearly apply. Through the Program, DFO has been able to take advantage of opportunities as they arise and to advance its strategic objectives through the use of transfer payments.

2.2 Program Outcomes and Performance Measurement

The Program provides support to DFO programs for safe, healthy, productive waters and aquatic ecosystems, for the benefit of present and future generations. This is pursued through recipients who are associated with research, development, management, conservation, protection or promotion of fisheries and oceans initiatives and related issues which will further the mandate of the department.  The program is intended to improve the capacity of DFO programs and recipients of funding to share and obtain knowledge and improve capacity for understanding of fisheries and oceans issues. See logic model in Annex I.

The Program has developed and implemented a Performance Measurement Strategy since it was last evaluated. The Program has been monitoring its performance and produced annual reports for 2010/11 and 2011/12.

2.3 Program Activities

Program officers at both the regional and headquarters level receive applications from DFO programs. Assessment of applications is undertaken by the program officers and their respective Financial Officers with support from the Senior Financial Manager, Financial Management Advisory Services Branch (FMASB) of the Chief Financial Officer (CFO) in Headquarters. Applications are assessed for compliance against the Program’s Terms and Conditions. Subject to Deputy Minister or Ministerial approval, the Program enters into a grant or contribution agreement with the successful applicants. DFO programs carry out the terms agreed to in the grant or contribution agreement. DFO programs submit a final project report following the successful completion of their individual projects. The Program reviews final reports to ensure that all project deliverables complied with the DFO program’s intended project goals and objectives.

2.4 Clients and Stakeholders

Stakeholders are those involved in or associated with research, development, management, conservation, protection or promotion of fisheries and oceans initiatives and related issues.

Recipients and/or beneficiaries include, but are not limited to:

  • The public;
  • Organizations -- Canadian and international, non-profit and private;
  • Aboriginal groups;
  • Recognized post-secondary institutions; and
  • Other levels of government or agencies.

2.5 Governance

The “Class” Grants and Contributions Program is managed within the office of the Chief Financial Officer. The Director General of Budget Planning and Financial Management is responsible for both the grants and the contributions components. Regional and headquarters applications for grants and contributions are vetted by FMASB before being submitted for approval to the Deputy Minister for contributions or to the Minister for grants.

2.6 Program Resources/Budget

The Program, based on the Main Estimates for 2009 - 2010 to 2012 - 2013, received the following amounts for grants and contributions as reflected in Table 1.

Table 1: Main Estimates for Class Grants and Contributions Programs (2009 - 2010 to 2012 - 2013)

Fiscal Year   Grants     Contributions   Total
2009/10       $152,500   $234,000        $386,500
2010/11       $152,500   $234,800        $387,300
2011/12       $152,500   $234,800        $387,300
2012/13       $152,500   $234,800        $387,300
Total         $610,000   $938,400        $1,548,400

The Contributions component has an annual maximum of $4M and the Grants component an annual maximum of $1.5M. Each Sector, Region and/or Agency identifies, as part of its annual business planning process, the amount of funding sourced from existing reference levels that it will have available to address requests from external parties. Each successful request requires that the sponsoring responsibility centre manager provide Vote 1 Operating and Maintenance (O&M) funds in order to obtain Vote 10 Grants and Contributions funds.

3. Methodology


The evaluation employed a program theory-driven evaluation science approach such as that used by Rossi, Lipsey and Freeman in Evaluation: A Systematic Approach (2004), whereby the evaluation is guided by a conceptual framework (logic model or program theory) grounded in relevant substantive knowledge.¹ A non-experimental design was used for this evaluation. This model was chosen because the Program is offered and delivered to all DFO programs, as well as for its low materiality.

A small-scale evaluation method was applied in order to calibrate the most appropriate level of effort for this evaluation. To do so, evaluators assessed the risks associated with the evaluation and found the following:

  • A performance baseline was established when the Program was previously evaluated in 2008-09. It was implemented as planned, was relevant and cost-effective and, being a longstanding program, suitably achieved its long-term outcomes. All recommendations arising out of that evaluation were also suitably addressed;
  • Ongoing performance data was collected since 2010/11. The Program’s performance reporting indicated that the Program demonstrated suitable performance since the previous evaluation;
  • No important internal or external risks were identified. No major changes were found with respect to the Program’s internal factors (e.g. consistent inputs, activities, governance) or external factors (e.g. changes in policies or regulatory requirements). The conditions that existed when the Program was previously evaluated had remained constant;
  • The degree of complexity of program delivery is very low. It is essentially delivered from a single site by a small number of staff. Activities relative to the production of outputs are limited in number and are of an operational and procedural nature.

On the basis of this analysis, the evaluation team concluded that the risks associated with the evaluation were low. Following consultations with the Program, few specific areas of interest warranting further investigation were identified. The evaluation team concluded that the level of effort for this evaluation could be reduced. Evaluators adjusted the breadth of the evaluation and limited the number of evaluation questions to respond directly to the Policy on Evaluation’s five core issues and to the information requirements of the Program.

To further calibrate the evaluation, evaluators implemented an iterative approach to data collection followed by frequent data analysis. Program documents and performance information were first consulted, followed by interviews. Monitoring data collection and analysis on an ongoing basis enabled evaluators to assess the extent to which additional data collection efforts were required. Data collection ceased once evaluators judged that the evidence was sufficient to support the findings and conclusions.

  • A review of program documents, performance data and individual project files was conducted to assess performance. Evaluators examined all of the 2010/11 agreement project files as their sample, since almost half of the projects are of an ongoing nature. These files yielded a richness of information (e.g. project applications and reports) conducive to an in-depth analysis. The 2010/11 files were selected in order to validate the contents of the Program’s 2010/11 annual performance measurement report. They also served as a basis of judgement regarding the reliability of the data in the 2011/12 performance report.
  • A comparative analysis with two other federal departments with similar programs was conducted to examine alternative delivery options and the cost-effectiveness of the Program.
  • Interviews with program personnel were conducted to gather insights about specific issues (management and staff N=2) and with stakeholders and beneficiaries (DFO Programs N=3 and external beneficiaries N=3). Following an analysis of responses, evaluators concluded sufficient data had been gathered.

There are only minor limitations, none of which impacts the interpretation of the findings and conclusions. The principal threat to the evaluation was relying on insufficient information from the previous evaluation, performance data and interviews to arrive at valid findings and conclusions. To mitigate this threat, evaluators used an iterative approach, as previously explained, to data collection and frequent analysis. This ensured that the data collected provided sufficient evidence to support the findings and conclusions. See Annex I for more detailed information about the methodology employed to carry out this evaluation.

¹ Donaldson, Stewart (2007). Program Theory-Driven Evaluation Science: Strategies and Applications

4. Major Findings


4.1 Relevance

This section of the report examined whether there is a continued need for the Program; whether the Program is aligned with Government of Canada and Fisheries and Oceans Canada priorities; and whether the Program is aligned with federal roles and responsibilities.


Key Finding: The Program responds to a need, aligns with Government of Canada and Fisheries and Oceans Canada priorities and is a federal government responsibility.


Need

The Program was originally created to consolidate individual proposals into a single Treasury Board Submission and avoid the preparation of multiple costly and time consuming submissions.  The 2009 Environment Canada and 2012 Natural Resources Canada “class” grants and contributions evaluations found that grants & contributions (G&C), as a cost-effective alternative to contracts, collaborative/partnership agreements, joint projects or in-kind support, provide programs with increased flexibility with respect to supporting the work of external parties.  They also found that G&C programs are an appropriate approach to support individuals, organizations and other levels of government in the pursuit of departmental priorities. All respondents interviewed for this evaluation agreed that the Program was a cost-effective, mutually beneficial means of acquiring and sharing information and knowledge with/from organizations.

Table 2 below illustrates that, with the exception of 2012-13, there has been a relatively consistent level of demand for the Program. In total, 78 projects were funded, for a total of $2.4M, over the last four years. Almost all respondents stated that the Program is needed because it provides a cost-effective means of establishing a grant and/or contribution agreement with an external party. The decline in approved G&Cs can be attributed to the recent economic downturn and resulting period of fiscal restraint. Evaluators concluded that there is a continued need for the Program.

Table 2 – Number and dollar value of grants and contribution agreements

($ ‘000s)       2009/10       2010/11       2011/12       2012/13       Total
                #   Funds     #   Funds     #   Funds     #   Funds     #   Funds
Grants          18  $344      17  $334.5     9  $164       4  $79.5    48  $922
Contributions   11  $289       9  $441       6  $611       4  $151     30  $1,492
Total           29  $633      26  $775.5    15  $775.5     8  $230     78  $2,414

Government Priorities

The previous evaluation concluded that all projects were aligned with one of the three departmental strategic outcomes. Based on an examination of the 2010/11 and 2011/12 performance data, the current evaluation confirmed that DFO program recipients consistently used the Program’s funding mechanism in alignment with federal objectives, such as reducing workloads and encouraging administrative efficiencies,² and with departmental strategic outcomes. All respondents agreed that there is a high degree of consistency between funded projects and the Department’s priorities and strategic outcomes.

Federal Role

The Department's work is guided by three key pieces of legislation:

  • The Oceans Act entrusts the Minister with leading integrated oceans management and providing coast guard and hydrographic services.
  • The Fisheries Act gives the Minister responsibility for the management of fisheries, habitat, and aquaculture.
  • The Species at Risk Act gives the Minister responsibilities associated with the management of aquatic species at risk.

The funded projects reviewed for this evaluation aligned with the mandate and requirements of the Minister as laid out in the aforementioned acts.  DFO is well positioned to deliver the Program given its knowledge of key stakeholders in the oceans and fisheries sectors and its capacity to bring these stakeholders together. The technical, scientific, and project management expertise of DFO staff is an asset for the effective selection, management and monitoring of the agreements.

The federal government and DFO’s roles are considered appropriate.  The Program supports DFO programs by providing assistance to selected organizations for designated activities that align with DFO’s key pieces of legislation.  The Program’s mandate falls under and aligns with federal government roles and responsibilities.

² The Report of the Independent Blue Ribbon Panel on Grants and Contributions Programs (2006)

4.2 Effectiveness

This section of the report examined whether the Program is achieving its expected immediate, intermediate and longer term outcomes.


Key Finding: The Program improves the capacity of recipients and stakeholders to share and obtain knowledge and to understand fisheries and oceans issues. The Program provides support to DFO programs for safe, healthy, productive waters and aquatic ecosystems.


The Program enabled DFO programs to take advantage of 78 opportunities over the last four years. The Program funded a variety of organizations that included universities, international organizations, various associations (aquaculture, fisheries economists, and a sealers’ co-op), science centres, councils, etc. In 2010/11, 43% of the approved projects were domestic while 57% were international in nature. The funding was used for events such as conferences, workshops, meetings and/or exhibits. Finally, the grants and/or contributions supported ongoing and/or project-type initiatives. In 2010/11, 46% of the projects were Fisheries and Aquaculture related activities, 27% were science related projects, and the remaining 27% included a Coast Guard initiative and several Habitat/Oceans Management initiatives.

All projects focused on knowledge that was relevant to DFO. For example, in 2010/11, projects funded initiatives that included knowledge and skills development, such as developing curriculum in Marine Safety and Environment Protection or workshops to improve capacity for the implementation of a fisheries treaty; an interactive science centre exhibit on whales; governance initiatives, such as the management of the Arctic marine environment or strengthening international fisheries governance; science related knowledge, such as bluefin tuna research or recording temperature and salinity data in ocean waters; fisheries and aquaculture economics; a feasibility study for a commercial grey seal hunt; and the sharing of best practices for rebuilding fisheries. All respondents agreed that the types of knowledge addressed and generated by funded projects are directly related to DFO’s programs, e.g. Aquatic Invasive Species, Marine Navigation, etc.

Some projects enable DFO to learn about what types of research provinces and industry are involved in.  In some cases access has improved through steady growth in participation rates by interested stakeholders.  Also, the project funding is vital to enable access to more isolated individuals and/or organizations from third-world countries for example.  Funded projects can build worldwide capacity allowing for a better understanding of migratory patterns and stock sizes of certain species and ensuring their long term sustainability.

Project funding contributes towards building and maintaining infrastructure. For example, certain initiatives would be compromised if they were to go unsupported by DFO. Projects can ensure continued access to and sharing of knowledge and information, as well as the leveraging of research from other countries.

The evaluation found that all of the 2010/11 funded grants and contributions addressed a fisheries and/or oceans issue.  For example some of the projects addressed one or more fisheries related issues such as aquaculture, aquatic animal health, fisheries strategies and governance.  Other projects touched on one or more oceans related issues such as oceans management, environmental response and/or hydrographic products and services. In some cases, projects addressed both fisheries and oceans issues. Overall respondents felt that the funded projects considerably improved the understanding of fisheries and oceans issues for DFO and external stakeholders such as industry, other governments and/or international researchers. 

Projects create opportunities, making interactions possible amongst government and/or, academia, multiple disciplines and/or industry leading to an improved understanding of fisheries and oceans issues. For example, projects allow individuals from diverse backgrounds to interact, challenging them to think outside of their own disciplines. Interactions amongst oceanographers and meteorologists for example can lead to a broader understanding of climate change.  Interactions yield beneficial information sharing between participants. For example, projects support the sharing and acquiring of data from other countries. Better information is available for ocean forecasting, safety and security, environmental response, fisheries management, etc.

“… data from other parts of the ocean such as currents, salinity, and temperature can impact the weather (fog) or changes in ice movement which can impact the navigation of waters in Canada. But the conditions could not have been forecasted without this information that is being collected by other member countries”

The 2010/11 projects reached out to beneficiaries and stakeholders in varying numbers. For example, an Ontario Federation of Anglers workshop reached 200 Ministry of Natural Resources active field staff; a Nature Quebec workshop reached 186 participants from 78 governmental and non-governmental organizations; a science centre exhibit on whales reached over 40,000 visitors; a coastal publication reached a potential 5,500 worldwide subscribers; and a fisheries economists’ conference hosted 156 participants from 11 countries. Respondents cited improved reach as a capacity building factor. In some cases, projects allowed access to a broad spectrum of individuals. For example, some conferences attract large numbers of domestic and/or international participants and presenters from government, industry and academia, e.g. Jacques Cousteau.

Finally, all DFO project sponsors agreed that projects contributed towards their program’s immediate, intermediate and/or long term outcomes. DFO project coordinators agreed that the Program provides them with considerable support to achieve DFO outcomes.

4.3 Efficiency & Economy

This section of the report examined whether the Program`s activities are efficiently producing outputs and whether it is operating in a way that minimizes the use of resources to achieve its intended outcomes.


Key Finding: Overall, the Program is cost-effective. It frees up considerable human resources that can be more effectively used by the Department.  The Program is well managed and appropriately located within the Department’s organizational structure.


Alternative Approaches

The Environment Canada (EC) and Natural Resources Canada (NRCan) evaluations found that, generally, there are more advantages than disadvantages with the class approach to grants and contributions. Efficiency and administrative ease are the primary advantages, as the approach reduces the need to develop new TB submissions and terms and conditions for every new grants and contributions (G&C) program. Any alternative G&C approach would need to reduce administrative burden, that is, avoid going to Treasury Board for every new G&C requirement.

The Program approved 78 grants and contribution agreements over the last four years.  This avoided the time required to prepare and develop the equivalent of 78 individual TB Submissions had this Program not been in place.  TB Submissions of this nature can require hundreds of person hours to develop and approve.  As a result, the Program allowed the Department to make more effective use of thousands of person hours over the last four years.

DFO allocates resources from its existing A-base funding for the administration of the Program, as none is provided via the TB submission.  Table 3 below illustrates that the overall cost to administer the Program was approximately $37.5K or equivalent to about 1.5% of all of the funding awarded to recipients over the last four years.

Table 3 – G&C Administrative Costs

                             2009/10   2010/11   2011/12   2012/13   Total
Number of G&Cs                    29        26        15         8       78
Funding awarded                $633K   $775.5K   $775.5K     $230K  $2,414K
Administration (FMAS Staff)     $14K    $12.5K     $7.2K     $3.8K   $37.5K
As a Percentage of Funding      2.2%      1.6%      0.9%      1.7%     1.5%
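The "As a Percentage of Funding" row of Table 3 can be reproduced with a short calculation (a sketch for illustration only, not part of the Program's reporting; figures are in thousands of dollars, taken from the table above):

```python
# Reproduce the administration-to-funding ratios reported in Table 3.
# All figures are in $ thousands, as in the table above.
funding = {"2009/10": 633.0, "2010/11": 775.5, "2011/12": 775.5, "2012/13": 230.0}
admin   = {"2009/10": 14.0,  "2010/11": 12.5,  "2011/12": 7.2,   "2012/13": 3.8}

# Yearly ratios, rounded to one decimal as in the table
for year in funding:
    pct = admin[year] / funding[year] * 100
    print(f"{year}: {pct:.1f}%")

# Overall ratio: $37.5K of administration against $2,414K awarded
overall = sum(admin.values()) / sum(funding.values()) * 100
print(f"Overall: {overall:.2f}%")
```

The yearly figures match the table (2.2%, 1.6%, 0.9%, 1.7%), and the overall ratio works out to roughly 1.55%, consistent with the "approximately 1.5%" cited in the text.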

Although respondents identified an issue regarding the amount of time required for the preparation of applications and the iterations required to complete them, they all agreed that preparing individual TB submissions, as an alternative, would be too time consuming.  Respondents did note that there could be more consistency and clarity of instructions for applicants and recipients regarding final project submission requirements.

Governance/Program Administration

NRCan has a “Class” G&C Program that is much larger and more complex than DFO’s Program. Its program tends to be more broadly administered, with roles and responsibilities more widely delegated. For example, the Director General, Financial Management Branch has the lead responsibility for establishing and maintaining the Treasury Board Authority for the “Class” G&C Program. However, lead responsibility for planning, managing and administering the Program may be delegated by assistant deputy ministers to directors general or, in the case of regions, to regional directors.

The DFO Program, being much smaller and less complex, is centrally managed and administered by a Senior Financial Manager within the Chief Financial Officer’s Financial Management Advisory Services Branch (FMASB). The evaluation found neither a conflict of interest nor any government policy suggesting that the Program should be managed and administered by a different sector at DFO. The evaluation also found that the Program has developed and implemented a Performance Measurement Strategy, produces annual reports and monitors its performance for decision-making purposes.

Best Practices/Lessons Learned

The most relevant best practices and lessons learned from the 2009 Environment Canada and 2012 Natural Resources Canada evaluations and from this evaluation are summarized below:

  • Providing managers with guidance materials (tools, templates) clearly communicates expectations ensuring consistent documentation is produced.  Training also provides additional opportunities to enhance clear understanding of procedures, roles and responsibilities as well as the value of the guidance materials.
  • A dedicated Program manager is important to providing consistent oversight of contribution agreements. It ensures consistency in agreements and tighter Program performance monitoring. It enables program applicants to focus more on project priorities.
  • Senior management oversight ensures that funded projects are consistent with departmental priorities.
  • The Program helps to maintain external relationships as they are important means to advance departmental priorities.
  • Regular two-way communication between project managers and recipients enhances the success of a project. Managing beneficiary expectations is important, and program coordinators should avoid making promises to them.
  • Plan well in advance when considering the use of the Program to allow for sufficient time to process application documents and for the required approvals.

5. Conclusions & Recommendations


5.1 Relevance

This evaluation concluded that the Program is relevant.  The Program is needed as it is a cost-effective means of exchanging information and knowledge between DFO and external parties.  In spite of a gradual decrease in the use of the Program, largely due to the recent period of fiscal restraint, there is a continued need for the Program.  The Program aligns with the Government of Canada priorities of achieving administrative efficiencies and with Fisheries and Oceans priorities and strategic outcomes.  Finally, the Program is a federal responsibility in that projects align with the mandate and requirements of the Minister.  DFO is well positioned to deliver this Program given its knowledge of key stakeholders in the oceans and fisheries sectors. 

5.2 Effectiveness

The evaluation concluded that, overall, the Program is achieving its intended outcomes.  The Program enabled a mix of 78 fisheries, aquaculture, science, Coast Guard, habitat and domestic or international projects over the last four years. All of these projects focused on a broad array of knowledge relevant to DFO, and each addressed one or more fisheries and/or oceans related issues. Projects enabled DFO programs to learn about the types of research that provinces and industry are involved in. Projects build worldwide capacity by providing access to domestic and foreign organizations and to individuals from developing countries who would otherwise be unable to participate. They make possible interactions among government, academia, industry and multiple disciplines that ultimately lead to an improved understanding of fisheries and oceans issues. In some cases, projects contribute towards maintaining important infrastructure, ensuring a continued flow of information from member states worldwide.

Individual projects can have significant reach.  In some cases, projects fund conferences or enable the annual collection of data.  Continuous interactions build communities that attract new members over time, and projects of an ongoing nature maintain and increase capacity by building upon existing knowledge bases.  In short, the Program enables DFO programs to achieve their outcomes and is seen as providing considerable support in doing so.

5.3 Efficiency & Economy

The evaluation found that the Program’s operational processes and governance structures efficiently produce outputs and economically achieve results. The Program is cost-effective: eliminating the need to develop individual TB Submissions for each grant and contribution saved thousands of person-hours over the last four years. The cost to administer the Program was approximately $37.5K, or 1.5% of the funding awarded over the last four years. No issues with its delivery were identified.

The Program is appropriately located within the Department’s organizational structure.  No areas of concern emerged regarding the administration of the Program within the Chief Financial Officer’s sector. The Program is taking full advantage of its Performance Measurement Strategy and using it to inform decision-making.

The most notable aspects of this Program are that it is well equipped with tools and templates and that it is centrally managed and administered. Central management helps ensure that the Program’s terms and conditions are adhered to and delivered consistently from project to project, and it facilitates monitoring of program performance.

5.4 Conclusion

The Program is relevant and performing as intended. No recommendations are required at this time to improve its performance.

Annex I: Methodology


3.1 Project Management

The evaluation was led by a senior evaluation manager and supported by an evaluation officer within the Evaluation Directorate at DFO. In order to ensure the fairness and usefulness of the Evaluation Report, the evaluation team collaborated with Program personnel on preparing a list of documents to review, identifying interview respondents and reviewing and providing feedback on interview guides and various reports.

3.2 Evaluation Approach & Design

The evaluation employed a program theory-driven evaluation science approach such as that used by Rossi, Lipsey and Freeman in Evaluation: A Systematic Approach (2004), whereby the evaluation is guided by a conceptual framework (logic model or program theory) grounded in relevant substantive knowledge.  A non-experimental design was used for this evaluation, in which measurements are taken after the program has been implemented with no comparison group.  This model was chosen because the Program is offered and delivered to all DFO programs as well as for its low materiality.

A small scale evaluation method was applied in order to calibrate the most appropriate level of effort for this evaluation. To do so, evaluators assessed the risks associated with the evaluation and found the following:

  • A performance baseline was established when the Program was previously evaluated in 2008-09.  The Program was implemented as planned, was relevant and cost-effective and, being a longstanding program, suitably achieved its long-term outcomes.  All recommendations arising out of that evaluation were also suitably addressed;
  • Ongoing performance data has been collected since 2010/11. The Program’s performance reporting passed a face validity test, which indicated that the Program has demonstrated suitable performance since the previous evaluation;
  • No important internal or external risks were identified.  No major changes were found with respect to the Program’s internal factors (e.g., consistent inputs, activities, governance) or external factors (e.g., changes in policies or regulatory requirements). The conditions that existed when the Program was previously evaluated remained constant;
  • The degree of complexity of program delivery is very low.  The Program is essentially delivered from a single site by a small number of staff, and the activities required to produce its outputs are limited in number and of an operational and procedural nature.

On the basis of this analysis, the evaluation team concluded that the risks associated with the evaluation were low.  Following consultations with the Program, few specific areas of interest warranting further investigation were identified. The evaluation team therefore concluded that the level of effort for this evaluation could be reduced. Evaluators adjusted the breadth of the evaluation and limited the number of evaluation questions to respond directly to the Policy on Evaluation’s five core issues and to the information requirements of the Program.

3.3 Logic Model

The evaluation drew on the following logic model for the purpose of defining the interrelationships between activities, outputs and outcomes and to define relevant evaluation questions.

Class Grants and Contribution Program Logic Model

[Image: Class Grants and Contributions Program logic model]

3.4 Key Issues & Evaluation Questions

The evaluation matrix in Annex II presents the key issues and evaluation questions addressed, as well as the lines of evidence used for each question. The questions were established in keeping with Treasury Board’s Policy on Evaluation, a review of key program documents and an evaluation planning session with key program personnel.

3.5 Data Sources

3.5.1 Document & File Review

A review of program documents, performance data and individual project files was conducted to assess performance. Evaluators examined all of the 2010/11 agreement project files as their sample; these files yielded a richness of information (e.g., project applications and reports) conducive to in-depth analysis.  The 2010/11 files were selected in order to validate the contents of the Program’s 2010/11 annual performance measurement report, and they also served as a basis for judging the reliability of the data in the 2011/12 performance report. Documents reviewed included, but were not limited to, government-wide, departmental-level and program documents (Departmental Performance Reports, Reports on Plans and Priorities, etc.).

3.5.2 Comparative Analysis

A comparative analysis was conducted to examine where other federal government departments with similar programs locate their CGC Program within their organizational structures. The analysis helped evaluators determine whether the DFO Program is properly situated within DFO.

A comparative analysis between the number of hours required to prepare a single CGC TB Submission versus preparing individual TB Submissions for all of the DFO Program applicants was conducted. This allowed evaluators to assess whether the Program was a more cost-effective means of achieving individual DFO Program outcomes.  

3.5.3 Key Informant Interviews

Interviews of Program personnel and clients were conducted to gather insights about specific issues from those who work directly within the Program (management and staff) as well as DFO Programs/sectors and direct beneficiaries who have benefited from the Program. 

Program staff (N=2) in the NCR were interviewed based on their principal role in the delivery of the Program. DFO program recipients (N=3) and beneficiaries (N=3) were randomly selected from amongst the 2010-11 project applicants. Following an analysis of responses, evaluators concluded that sufficient data had been gathered.

Separate interview guides were developed and tailored for each group. Interviews were conducted either by telephone or in person, in French or in English, and consisted primarily of open-ended questions.  Respondents were contacted in advance to schedule an appropriate time and received an interview guide before the interview. While interview lengths varied by respondent, the majority were approximately 30 minutes.

3.6 Analytical Methods

The analytical methods used for this evaluation were tailored to the nature and availability of the data to be gathered, which were in turn linked to the evaluation questions. The data from each evaluation method described above was summarized to address each of the evaluation issues/questions contained in the evaluation matrix. The analysis strategy included the triangulation of multiple lines of evidence. The rigour of the data analysis relied on previous evaluation findings and conclusions as a baseline as well as the ongoing performance data which was gathered during the scope of the evaluation. This involved the extraction of the results from each line of inquiry that relate to each evaluation issue. The Evaluation Team then analyzed the data from each line of inquiry for each evaluation question to develop a summary response to each question, taking into account the strengths and limitations of each line of evidence. The table below describes the proportional and frequency terms used in the report to quantify the extent of agreement amongst respondents to specific questions and issues.

Proportion and frequency of terms used
Proportion Terms Frequency Terms Percentage range
All Always 100%
Almost all Almost always 80-99%
Many Often, usually 50-79%
Some Sometimes 20-49%
Few Seldom 10-19%
Almost None Almost never 1-9%
None Never 0%

3.7 Methodological Limitations, Challenges & Mitigation Strategies

There were minor methodological limitations and constraints related to the conduct of this evaluation. Below is a summary of limitations and what was done to mitigate their effect:

  1. Threat to internal validity – the most notable threat to internal validity in this evaluation is the extent to which immediate, intermediate and long-term outcomes can be attributed to the Program, particularly given the inability to establish a comparison group. To mitigate this, the multiple lines of evidence used for this evaluation helped triangulate the impact of the Program on immediate and intermediate outcomes.
  2. Inaccurate or incomplete findings and conclusions from the 2008-09 evaluation – In order to ensure that the previous evaluation’s findings and conclusions were accurate, evaluators gathered a reasonable amount of data to complete a preliminary analysis that satisfactorily validated them.  Where no previous evaluation findings were applicable to the current evaluation issues under review, evaluators gathered the necessary evidence to arrive at sound findings and conclusions.
  3. Quality of Performance Measurement Data – The evaluation team elected to work with the Program’s 2010/11 Annual Performance Report as its primary data source for the evaluation.  In order to ensure accuracy of the data contained in the report, evaluators sought to validate it through an examination of the entirety of its associated project files. Evaluators determined the data was valid and could confidently draw data from any of the Program’s subsequent annual performance reports.

Annex II: Evaluation Matrix


Annex II: Evaluation Matrix
Issue/Question Indicator Data sources

Is there a continued need for the Program?

Evidence/demonstration that there is a continuing need for the Program

  • # of applications received for Gs & Cs, by $ value / year / grant or contribution / domestic or international
  • changes in conditions that established the original need
  • Interviews
  • Program documents and project files
  • Previous and ongoing program performance information

To what extent is there alignment in the use of the CGC funding mechanism by DFO program recipients with federal and departmental outcomes?

Degree of consistency with:

  • Government of Canada objectives & priorities
  • DFO strategic outcomes and priorities
  • Program documents and project files
  • Federal government policy documents
  • Previous and ongoing program performance information

To what extent do the role and responsibilities for the delivery of the CGC align with those of the federal government?

  • Evidence/demonstration that DFO possesses the role and responsibilities through   federal legislative authority  (acts/regulations/policy) to carry out its responsibilities
  • Program documents and project files
  • Previous and ongoing program performance information

To what extent is there improved capacity for recipients and stakeholders to share and obtain knowledge?

  • Types of knowledge gained by program and grants and contributions recipients.
  • Perceptions from respondents that there is improved capacity to share and obtain knowledge.
  • Interviews
  • Program documents and project files
  • Previous and ongoing program performance information

To what extent is there improved capacity for understanding of fisheries and oceans issues?

  • Depth of knowledge gained
  • Alignment with fisheries and oceans issues   
  • Perceptions from respondents that there is improved capacity for understanding of fisheries and oceans issues
  • Interviews
  • Program documents and project files
  • Previous and ongoing program performance information

To what extent is support provided to DFO program recipients by the CGC Program for safe, healthy, productive waters and aquatic ecosystems?

  • Perceptions from respondents that the Program provides support to DFO program recipients for safe, healthy, productive waters and aquatic ecosystems
  • Alignment between funded project outcomes and program recipient outcomes
  • Interviews
  • Program documents and project files
  • Previous and ongoing program performance information

To what extent is the program efficient and economical?

  • Challenges/barriers faced by applicants when using the Program
  • Comparison of alternative approaches (e.g., individual TB submissions versus a single Program TB submission); best practices/lessons learned from other departments
  • Interviews
  • Program documents and project files
  • Comparative analysis
  • Previous and ongoing program performance information