
EFIMAS Public (official) Website: http://www.efimas.org/

Work Package 5: Effectiveness of evaluation tools

Coordinator: Doug Wilson (IFM)

Reporting: Ditte Degnbol, Doug Wilson, Jenny Hatchard, J. Rasmus Nielsen, Barry Eustace, Katia Frangoudes, Troels Jacob Hegland, Ana Pitchon, Rikke Becker Jacobsen, Selina Marguerite Stead

Participants: EFIMAS WP5 List of Participants

Progress towards objectives

Objective

The objective of this work package is to produce a delivery framework for information to guide management decisions by

1) evaluating the technical performance and the effectiveness of the operational management evaluation tool as a means to inform decision making processes, and

2) developing a framework for the use of the management evaluation tool in decision making processes.

Developments in relation to objectives

Task 5.1 Technical tool evaluation

This task was carried out in the case studies. In preparation for, and at, the Lisbon EFIMAS meeting in April 2007, a sub-committee of the Steering Committee was created to oversee and coordinate this process. This allowed some EFIMAS scientists to be involved in evaluations of case studies in which they were not participants.

Deliverable 5.2 reflects Task 5.1, i.e. the technical/parametric evaluation. This is partly covered under Deliverable 4.2 with respect to describing how to evaluate uncertainty, sensitivity and robustness, and how to perform risk analysis within the evaluation framework. A general chapter describing how this is covered will be added in relation to the WP3 and WP4 sections of the EFIMAS DocuWiki. A group consisting of the WP3 and WP4 coordinators (Laurence Kell, Clara Ulrich-Rescan, Martin Pastoors, and J. Rasmus Nielsen) was appointed to ensure that this was done.

Aspects of Task 5.2.1, Proof Reading, were carried out. Proof reading was implicitly included in the overall process of developing and structuring the evaluation framework and the project: it formed part of the cyclic feedback process between WP3 and WP4, with the people involved developing the tools together, and the interactions thus established ensured that it took place. This constitutes an internal evaluation of the transparency of the technical tools. These aspects should also be described thoroughly in the final reporting of EFIMAS.

Task 5.2 Process evaluation of the evaluation framework and the use of it and delivery process mechanisms

With respect to process evaluation of the evaluation framework, its use, and the delivery process mechanisms, a long series of interviews and focus group meetings with stakeholders was held from 2006 to autumn 2008. This research took place in five countries: Ireland, the UK, Greece, Denmark and Spain. All of the focus group interviews in each country have been completed and the reports have been written. A summary chapter has also been written, which will be the basis of a journal article. A list of the focus groups held, representing different European fisheries systems, is given in the EFIMAS List of Dissemination, Products and Activities.

With respect to the planned stakeholder workshops, EFIMAS re-conceptualized these and combined them into one broad and general event. Fisheries Management Evaluation Frameworks in Action, the EFIMAS Conference, was held in Brussels in March 2008 with the participation of 45 people from a broad set of stakeholder groups as well as EFIMAS scientists. A report of this conference has been prepared and is included as part of Deliverable 5.1. The reasons for merging the stakeholder workshops into a combined conference were fairly straightforward. The stakeholder feedback received in EFIMAS WP5 indicated that stakeholders see many different uses for our modelling and also share common questions about its use. This feedback points towards a need for more, and more general, back-and-forth interactive approaches to advice, with which the EFIMAS community already has some experience. After four years of EFIMAS work and of stakeholder meetings and interviews under the project, it seemed clear that simple workshops demonstrating models to stakeholders would not have the same impact as actually solving real problems and discussing them in a more general and broader stakeholder context, which was the aim of the EFIMAS Conference.

The purpose of evaluating these modelling exercises in the broader context of a conference was also to showcase how EFIMAS has made concrete contributions, among other things to meeting stakeholder needs. Such exercises need to involve problems that stakeholders want solved and to address the practical issues they face. They typically take place within RAC working groups, during STECF meetings, during ICES meetings, or at other meetings with a small group of stakeholders, such as managers at any level or individual stakeholder groups such as fishers or NGOs.

The participatory modelling exercises, in which various stakeholders worked to apply EFIMAS models to practical fisheries management problems, were an important part of the conference, where the stakeholder groups and the EFIMAS scientists who have been working with them reported on their accomplishments and on what they have learned through these exercises. A number of discussions have been held within the team about the kinds of discussions going on among stakeholders and the kinds of questions that they will want the EFIMAS evaluation framework to answer. Feedback from stakeholder events has also been informally communicated to the framework development teams. These activities have identified the harvest control rules and aspects of management strategies that stakeholders anticipate will be most relevant to management decision making in the years after the EFIMAS model comes on line. The stakeholders involved have included DG Fisheries, scientists, environmentalists and fishing industry representatives.

Results

Process Evaluation: EFIMAS Focus Groups

The WP5 group decided on priorities for holding interviews and focus groups with stakeholders during 2006. A number of preliminary feedback reports were created by WP5 and reported in early activity reports. There have also been many opportunities for scientists working in WP3 and WP4 to participate in the WP5 focus groups; time was set aside in each group for these scientists to put their own questions to stakeholders. The focus groups are now complete and a full report on their results is part of Deliverable 5.1, posted below.

Task 5.2.1.1 Focus groups and manager interviews in Denmark

Focus groups were held in spring 2007 by IFM and DIFRES. The report has been completed and posted as part of the overall stakeholder interview report.

Task 5.2.1.2 Focus groups and manager interviews in Ireland

One focus group was held in Dublin and four in Killybegs in November of 2006 by IFM and MI. The report has been completed and posted as part of the overall stakeholder interview report.

Task 5.2.1.3 Focus groups and manager interviews in Greece

Focus groups and manager interviews were carried out by IFM and IMB in November 2006. The report has been completed and posted as part of the overall stakeholder interview report.

Task 5.2.1.4 Focus groups and manager interviews in Spain

Focus groups and manager interviews were carried out by IFM and AZTI in October and November 2006. The report has been completed and posted as part of the overall stakeholder interview report.

Task 5.2.1.5 Focus groups and manager interviews in the United Kingdom

Focus groups were carried out by IFM and the University of Newcastle in August. Manager interviews were carried out in February 2007. The report has been completed and posted as part of the overall stakeholder interview report.

Task 5.2.1.6 Manager and stakeholder interviews at the EU level

Eight manager interviews, including some with senior scientists at ICES, were carried out in 2007 and early 2008. These interviews were not analyzed separately but were integrated into the overall report, which has been completed and posted as part of the overall stakeholder interview report.

Task 5.2.1.7 Manager and scientists interviews outside of Europe

Interviews were carried out in Canada and Iceland because of opportunities for combined travel with other projects. Opportunities for further interviews did not arise; in particular, a planned trip to Australia was not carried out. These interviews were not analyzed separately but were integrated into the overall report, which has been completed and posted as part of the overall stakeholder interview report.

In relation to the focus groups and Deliverable D5.1, a report on the stakeholder focus groups and interviews has been produced, containing chapters from each of the five countries as well as a summary chapter. This report is the following:

Stakeholder Perspectives on Fisheries Science and Modelling: the Report of the EFIMAS WP 5 Stakeholder Research

Summary of focus group report

Process Evaluation: EFIMAS Conference

An EFIMAS Stakeholder Conference was held in Brussels on 11-12 March 2008.

Fisheries Management Evaluation Frameworks in Action - Introduction to the EFIMAS conference

The results of the EFIMAS Conference in relation to Deliverable D5.1 are given in the following EFIMAS Conference Report:

Report for The Fisheries Management Evaluation Frameworks in Action Conference, Brussels 11-12 March, 2008

Furthermore, the presentations from the EFIMAS Conference are given here:

EFIMAS Conference Bruxelles, March 2008

EFIMAS Technical Evaluation

Fisheries management is basically about balancing conflicting objectives. Evaluating the trade-off between different objectives with different time horizons is therefore central to management decisions, and the established management evaluation framework should be able to inform the management decision process in this respect. However, informative comparison of outcomes for different objectives will require that the framework is able to analyse disparate and alternative types of outcome, and that results can be communicated.

The aim of the project is to support fisheries managers and different stakeholders, including the catching sector, in their capability to make strategic choices. Hence, the project aims to provide outputs that make it possible to compare alternative options, e.g. in terms of stocks and economic returns for fleets and the industry, addressing questions such as: which strategy is likely to give better returns relative to objectives than another? It is not aimed at providing absolute performance measures and predictions for each option. The management evaluation framework can answer “what if?” questions, such as: if one fishery is reduced relative to another, or if the growth dynamics of a stock change from a to b, what then? However, given the complexity of the ecosystems and fisheries systems evaluated, the framework cannot include all dynamics in the system, and its predictive power is accordingly limited. The predictive power of the framework is difficult to determine and will always be a case-specific consideration. At best, the output of the evaluation framework is of the nature: “It is likely that management regime A gives a better performance than management regime B” with respect to a selected “Measure of Performance.” In general, care must be taken in using the models and tools in the evaluation framework as robust predictive tools, given the complexity of the systems they attempt to comprehend, and the output from the models should not be used uncritically to quantify complex scenarios in absolute terms.
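The relative, probabilistic nature of such output can be sketched with a deliberately minimal Monte Carlo comparison of two harvest regimes. This is not EFIMAS code (the project's frameworks are FLR and TEMAS); it is a toy surplus-production operating model with assumed, illustrative parameter values, intended only to show how "regime A versus regime B" statements arise from repeated stochastic simulation rather than from a single prediction.

```python
import random

# Toy Schaefer surplus-production operating model. All parameter values
# (r, K, error levels) are illustrative assumptions, not taken from any
# EFIMAS case study.
def simulate(harvest_rate, years=20, r=0.4, K=1000.0, seed=None):
    rng = random.Random(seed)
    b = 0.5 * K            # start the stock at half of carrying capacity
    total_yield = 0.0
    for _ in range(years):
        # process error on stock growth
        growth = r * b * (1 - b / K) * rng.lognormvariate(0.0, 0.2)
        # implementation error: realised catch deviates from the target rate
        catch = min(harvest_rate * b * rng.lognormvariate(0.0, 0.1), b)
        b = max(b + growth - catch, 1.0)
        total_yield += catch
    return total_yield, b

def compare(rate_a, rate_b, n_runs=2000):
    """Estimate P(regime A out-yields regime B) over stochastic runs."""
    wins = 0
    for i in range(n_runs):
        yield_a, _ = simulate(rate_a, seed=i)
        yield_b, _ = simulate(rate_b, seed=n_runs + i)
        if yield_a > yield_b:
            wins += 1
    return wins / n_runs

# A moderate versus an aggressive harvest rate, compared in relative terms.
p = compare(0.15, 0.35)
print(f"P(regime A outperforms regime B in cumulative yield): {p:.2f}")
```

The point of the sketch is the form of the answer: a probability that one regime outperforms another under a stated measure of performance, never an absolute forecast for either regime.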

The objective and content of this technical evaluation report is to give an overview of the facilities, capabilities and utilities for technical evaluation in the framework, including uncertainty and error assessment, sensitivity analysis, and risk assessment of the established fisheries management evaluation framework, as well as its predictive power.

The technical evaluation is part of the overall evaluation of the established management evaluation framework, where the “Overall Evaluation” includes both the “Process Evaluation” and the “Technical Evaluation.”

The “Process Evaluation” is described in other reports, including the EFIMAS Conference Summary Report as well as the Stakeholder Focus Group and Interview Reports produced under the EFIMAS Project (WP5 in cyclic feedback with WP3-WP4). The process evaluation focuses on participatory management and evaluation through cyclic feedback from stakeholders in relation to the output and results produced by simulations and modelling under the management evaluation framework, when running different management scenarios as well as different options for the input to the scenario evaluation, which can be changed by the stakeholders.

The “Overall Evaluation” includes testing the general utility of the developed operational management evaluation framework through:

Iterative and cyclic process between WP3, WP4 and WP5 including feedback from regional workshops

Evaluation of the ability of the evaluation framework to capture changes in the fisheries systems

Applicability in other stocks / fisheries (general utility evaluation)

A part of the “Overall Evaluation” covers the technical evaluation, for which the established management evaluation framework(s) has implicit and built-in facilities, capabilities and utilities, including among others the following:

Technical Tool Evaluation

  • Evaluation of uncertainty and errors
  • Evaluations of sensitivity, robustness, predictive power, and limitations in the use and set-up of the models and tools in the framework:
    - Sensitivity analyses
    - Evaluation of robustness
    - Evaluation of predictive power
    - Evaluation of limitations in use and set-up
  • The code of the evaluation tools is proof-read and tested by alternative coding in critical cases
  • Real and simulated data sets are compiled, representing a wide range of data properties, system characteristics and different hypotheses about the underlying processes
  • Sensitivity tests are performed, and the robustness of the evaluation framework and descriptive models is evaluated by applying them to these diverse data sets
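The sensitivity-analysis item above can be illustrated with a minimal one-at-a-time perturbation sketch: each input parameter is shifted by ±10% and the relative change in a model output is recorded. The equilibrium-yield function and its parameter values are illustrative assumptions chosen for simplicity; EFIMAS case studies apply the same idea to far richer operating models.

```python
# One-at-a-time (OAT) sensitivity sketch on a simple Schaefer
# equilibrium-yield model. Parameters r (growth rate), K (carrying
# capacity) and h (harvest rate) are assumed, illustrative values.

def equilibrium_yield(params):
    """Long-term yield at harvest rate h under Schaefer dynamics."""
    r, K, h = params["r"], params["K"], params["h"]
    if h >= r:
        return 0.0             # harvest rate exceeds growth: stock collapses
    b_eq = K * (1.0 - h / r)   # equilibrium biomass
    return h * b_eq

def oat_sensitivity(model, base_params, delta=0.10):
    """Relative output change for a +/-delta perturbation of each parameter."""
    base = model(base_params)
    result = {}
    for name in base_params:
        effects = {}
        for sign in (+1, -1):
            perturbed = dict(base_params)
            perturbed[name] = base_params[name] * (1 + sign * delta)
            effects[sign] = (model(perturbed) - base) / base
        result[name] = effects
    return result

base = {"r": 0.4, "K": 1000.0, "h": 0.15}
for name, eff in oat_sensitivity(equilibrium_yield, base).items():
    print(f"{name}: +10% -> {eff[+1]:+.1%}, -10% -> {eff[-1]:+.1%}")
```

In a real case study the perturbed quantities would be data inputs and structural assumptions as well as parameters, and the outputs would be the management performance measures of interest.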

The technical evaluation includes rigorous tests of the technical validity of the model implementation, sensitivity tests, and tests of robustness in relation to data error and to assumptions about the resource system, the fisheries, and management implementation. It also evaluates utility in terms of the technical requirements for setting up and using the evaluation framework, which needs to be done on a case-specific basis. Examples of the technical evaluation are given on a case-specific basis and presented under the individual case studies (see the EFIMAS DocuWiki as well as the EFIMAS WP4 Technical Reports).
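The "alternative coding" check listed among the technical-evaluation facilities can be sketched as follows: the same quantity is computed by two independently written routines and the results compared. The per-recruit function below is a deliberately simple stand-in with assumed mortality values; EFIMAS applied the idea to the actual FLR/TEMAS implementations.

```python
from math import exp

# Cross-validation by alternative coding: an iterative survivorship loop
# versus a closed-form geometric series for the same per-recruit quantity.
# M = natural mortality, F = fishing mortality (illustrative values only).

def bpr_iterative(M, F, max_age=10):
    """Biomass-per-recruit proxy via an explicit survivorship loop."""
    survivors, total = 1.0, 0.0
    for _ in range(max_age):
        total += survivors
        survivors *= exp(-(M + F))
    return total

def bpr_closed_form(M, F, max_age=10):
    """The same sum written as a geometric series in q = exp(-(M+F))."""
    q = exp(-(M + F))
    return (1 - q ** max_age) / (1 - q)

# Agreement across a range of inputs is evidence (not proof) that both
# implementations encode the intended model.
for M, F in [(0.2, 0.0), (0.2, 0.3), (0.1, 0.6)]:
    a, b = bpr_iterative(M, F), bpr_closed_form(M, F)
    assert abs(a - b) < 1e-9, (M, F, a, b)
print("alternative implementations agree")
```

The design point is independence: the two routines should share as little code and as few intermediate steps as possible, so that a common bug is unlikely to hide in both.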

The Technical Evaluation is described in a special section under WP3 on the EFIMAS DocuWiki, as well as under the WP 5.1 section of the DocuWiki in respect of this report, and in the Technical Reports associated with WP3 (e.g. Deliverable 3.4, Report of Final Software Package with Documentation). Together, these constitute the general structure, content and approach of the technical evaluation capability and facilities of the generic management evaluation framework established under the EFIMAS project. Furthermore, the technical evaluation is described in detail on a case-specific basis in the technical evaluation associated with each case study in Deliverables 4.2 and 3.4 and on the EFIMAS DocuWiki under WP4. As such, Deliverable 5.2, Technical Evaluation Reports, is also included in the above reporting and deliverables. Deliverable 5.2 reflects Task 5.1, the Technical/Parametric Evaluation, which is covered under Deliverables 3.3, 3.4 and 4.2 (by project month 48) with respect to describing how to evaluate uncertainty, sensitivity and robustness, and how to perform risk analysis within the established management evaluation framework(s). Besides this summary, EFIMAS has, in cooperation with other projects, produced additional technical reporting in two more detailed descriptions of the technical evaluation facilities of the established framework(s).

Thus, more details and facets of the technical evaluation facilities of the established framework(s) are comprehensively described in the present report:

  • EFIMAS 2008. Technical Evaluation Summary Report. EFIMAS EU SSP8-CT-2003-502516 Project Report, September 2008. DTU Aqua, DK.

Technical Evaluation Summary Report

associated with the two following reports:

  • Mosqueira, I. 2008. Validation, Verification and Testing of software and models in FLR. Report CEFAS, UK, September 2008: 6 pp.

FLR Validation, verification and testing

  • Sparre, P. J. 2008a. User’s Manual for the EXCEL Application “TEMAS” or “Evaluation Frame”. DTU-Aqua Report 190-08: 182 pp. ISBN 978-87-7481-077-3.

User's Manual for TEMAS Evaluation Frame

as well as through the WP3 Technical Report and the case-specific technical evaluation of the case studies in the WP4 Technical Report by Month 48 (as well as on the EFIMAS WP3 and WP4 DocuWiki pages).

EFIMAS Policy Implementation Plan

An EFIMAS Policy Implementation Plan has been produced in relation to Deliverables 5.3 and 1.7 of the project. This has been done in the form of a public folder and a Policy Brief distributing the Policy Implementation Plan for the EFIMAS Project.

The Policy Brief is given here:

EFIMAS Policy Implementation Plan and Policy Brief

Deliverables

Deliverable D5.1: Report(s) from evaluations from 4 regional stakeholder workshops

The final D5.1 consists of two documents. The first is a report of the final stakeholder conference. The second is a report on the stakeholder focus groups and interviews which contains chapters from each of the five countries as well as a summary chapter. The summary chapter of this report will be the basis of a peer-reviewed journal article.

Stakeholder Perspectives on Fisheries Science and Modelling: the Report of the EFIMAS WP 5 Stakeholder Research

Report for The Fisheries Management Evaluation Frameworks in Action Conference, Brussels 11-12 March, 2008

Deliverable D5.2: Technical evaluation report(s)

  • EFIMAS 2008. Technical Evaluation Summary Report. EFIMAS EU SSP8-CT-2003-502516 Project Report, September 2008. DTU Aqua, DK.

Technical Evaluation Summary Report

  • Mosqueira, I. 2008. Validation, Verification and Testing of software and models in FLR. Report CEFAS, UK, September 2008: 6 pp.

FLR Validation, verification and testing

  • Sparre, P. J. 2008a. User’s Manual for the EXCEL Application “TEMAS” or “Evaluation Frame”. DTU-Aqua Report 190-08: 182 pp. ISBN 978-87-7481-077-3.

User's Manual for TEMAS Evaluation Frame

Deliverable D5.3: Evaluation process manual. A policy brief describing best practices in the use of quantitative evaluation tools in complex, multi-stakeholder policy environments

The final D5.3 consists of a policy brief which summarizes the basic outcomes and strategies of EFIMAS in a form that is highly accessible to both stakeholders and policy makers.

EFIMAS Policy Implementation Plan and Policy Brief

Milestones

M6: Evaluations from 4 regional stakeholder workshops

This is completed and the reports are given above under Deliverable 5.1.

List of EFIMAS Dissemination, Products and Activities (WP5)

Meeting Documents and Other Case Specific Work - working documents, models, analyses etc.

 
efimas1/wp5/main.txt · Last modified: 2009/01/14 15:56 by admin
 