Decision analysis and risk management 2013/Homework


Please read the homework assignments carefully and follow the instructions. If there is something unclear, please ask the course organizers (or fellow students) to explain and clarify! NOTE: Only in homework 2 were you asked to write your answers on this page. In all other homework assignments the answers are written somewhere else, often on your own user page.

Also add links to your homework answers in the table below. The evaluation of the homework exercises will be based on the answers found by following the links in the table. Students themselves are responsible for having correct, complete and up-to-date links to their homework answers. If you need help in adding the links to your homework answers to the table, please ask the course organizers (or fellow students) for advice. A convenient way to get help is to come to the exercise sessions.

Follow-up table

Homeworks: HW 1: Mikko Pohjola's thesis; HW 2: Basic concepts of open assessment; HW 3: Draft of an assessment; HW 4: Climate policy decisions and actions; HW 5: Collaboration in climate policy assessment; HW 6: Structure of pages and objects and R code; HW 7: Structured discussion; HW 8: ERF's for IEQ factors; HW 9: Evaluation of assessment; Seminar.

User:Adedayo - HW1, HW3, HW4
User:Adnank - HW1 (OK)
User:EmmaA - HW1 (OK), HW3
User:Isabell Rumrich - HW1 (OK), HW2, HW3, HW4
User:Johnagyemang - HW1 (OK), HW2, HW3, HW4, HW5
User:Joshuan - HW1 (OK), HW3
User:Juho Kutvonen - HW1 (OK), HW2, HW3, HW4, HW5, HW8
User:Jukka Hirvonen - HW1 (OK), HW2, HW3, HW4, HW5
User:Kasperi Juntunen - HW1 (OK)
User:Matthew - HW1 (OK)
User:Niklas - HW1 (OK)
User talk:Phatman - HW1 (OK)
User:Salla - HW1 (OK), HW3, HW4, HW5, HW8
User:Sam0911 - HW1 (OK)
User:Sami Rissanen - HW1, HW3, HW4
User:Soroushm - HW1 (OK)
User:Stefania - HW1 (OK), HW3, HW4
User talk:Thomasa - HW1 (OK), HW3

Homework 1: Mikko Pohjola's thesis

Due date: 10 Jan

Read (or browse) Mikko's thesis (in heande, username and password needed) and provide brief answers to three (3) questions from the following question list. You are free to choose which questions to answer. Write your answers on your own Opasnet user page. Instructions on creating a user account and editing your own user page will be given in the first lecture. In case of difficulties in wiki editing, write your answers in a separate document and copy them to your user page later. The questions and answers will be discussed in the second lecture (10 Jan). A sufficient length for each answer is a few sentences or bullet points. Please do not write lengthy essays; instead, try to identify and briefly describe the main points relevant to each question. The idea of this homework is not to find the right or correct answers, but to introduce the conceptual basis of this course to the students.

Questions:

  1. What is the main purpose of environmental health assessment?
  2. What is pragmatism?
  3. What are the main differences between regulatory and academic assessment approaches? Give examples of each.
  4. What are the main differences between traditional and novel assessment approaches? Give examples of each.
  5. What are the main differences between open assessment and most other assessment approaches?
  6. What is benefit-risk assessment?
  7. What is impact assessment?
  8. What different purposes are there for participation in assessment and/or decision making?
  9. What are the dimensions of openness?
  10. What relevant stakeholder roles are there in environmental health assessment and related decision making?
  11. What is effectiveness in the context of environmental health assessment and related decision making?
  12. What is the trialogical approach to knowledge creation and learning?
  13. What is decision support?
  14. What is a pragmatic knowledge service?
  15. What is collaboration?
  16. What are the properties of good assessment?
  17. What is the role of modelling in assessment and policy making?
  18. What parts does the framework for effective assessment and knowledge-based policy consist of?
  19. What does it mean that the results of assessments can be considered intentional artifacts?

Homework 2: Basic concepts of open assessment

Due date: 11 Jan

  • Task: Read the introductory pages listed below and write one question that you think needs clarification. The questions will be answered during the next lecture.
Materials and examples for training in Opasnet and open assessment:
  • Help pages: Wiki editing, How to edit wikipages, Quick reference for wiki editing, Drawing graphs, Opasnet policies, Watching pages, Writing formulae, Word to Wiki, Wiki editing Advanced skills
  • Training assessment (examples of different objects): Training assessment, Training exposure, Training health impact, Training costs, Climate change policies and health in Kuopio, Climate change policies in Kuopio
  • Methods and concepts: Assessment, Variable, Method, Question, Answer, Rationale, Attribute, Decision, Result, Object-oriented programming in Opasnet, Universal object, Study, Formula, OpasnetBaseUtils, Open assessment, PSSP
  • Terms with changed use: Scope, Definition, Result, Tool


Add your question below
  • Are the methods and solutions used in the Kuopio climate change programme for 2020 sufficient to counter possible climate change and reduce its effects to a safe level? If not, what other methods should be taken into account, given that 13 coal sites are still active in Finland?
  • What does the page type 'nugget' mean?
  • Does an encyclopedia article explain the topic so that everyone can understand it, or is it meant only for experts?
  • What is R code and how does it work?
  • If the topic of a risk assessment is a so-called hot topic and the open assessment includes many participants, how do we handle the situation so that the discussion does not ramble? In other words, how do we keep irrelevant information away?
  • How effective is the supervision in Opasnet? For example, if someone writes wrong information on purpose just to harm the website, how long does it take before someone in Opasnet notices it?
  • Although the open assessment method was originally developed for providing solutions to environmental health problems, are there today any real applications to economic issues?
  • How is PSSP used in assessments? Are there examples of it in Opasnet that could help to understand the methodology better?
  • What is R code? How and where is R code written, programmed and executed?
  • What does the validity of contributions made in a trialogue depend on?
  • What is the role of the Nugget in Decision Analysis?
  • Is the structure and vocabulary used in an assessment always as explained in the lecture and in Opasnet, or do other institutions do it differently?


----#: . Jouni, Marjo: when, where, how shall we provide answers to these questions? Perhaps within related method sessions and/or exercises? --Mikko Pohjola 08:12, 16 January 2013 (EET) (type: truth; paradigms: science: comment)

Homework 3: Draft of an assessment

Because homework 3 answers will be used as materials in homework 9, the deadline for homework 3 is Monday 4 February.

Due date: 21 Jan (extended to Monday 4 February; see the note above)

  • Task: With your pair, draft an assessment about the topic agreed on during the lecture. Write the draft assessment on either your or your partner's user page (and put a link to it on the other's user page). Copy the headings and explanations below to the page and use them as a template. Choose your specific topic from these areas: a) the Talvivaara mine, b) metal mining in general, or c) climate change policies in cities.

Scope

Defines the purpose of the assessment: why is it done?

Question

A research question that the assessment attempts to answer.

Intended use and users

List of users that are supposed to need the assessment. Also, how do we expect them to use the information?

Participants

Who needs to participate to make the assessment a well-balanced and well-informed piece of work? Also, if specific reasons exist:

Scenarios

Decisions and decision options considered. Also, if scenarios (defined here as deliberate deviations from the truth) are used,

Analyses

What statistical or other analyses are needed to be able to produce results that are useful for making conclusions about the question?

Answer

Results

What are the results of the analysis?

Conclusion

What is the conclusion about the question based on the results obtained?

Rationale

Endpoints

  • What are the stakeholders that we should consider?
  • What are the endpoints that a stakeholder is interested in? How would the stakeholder summarise the endpoints to derive an overall preference ranking for outcomes of decision options? Think about this separately for each stakeholder.

Variables

  • What are the issues that should be looked at to be able to understand the outcomes of the decision options?
  • Typically, with health impact assessments:
    • What emissions and exposures should be considered?
    • What health endpoints should be considered?
    • What exposure-response functions should be considered?
    • What population subgroups should be considered?

Homework 4: Climate policy decisions and actions

Consider that you are given an assignment to assess the direct or indirect health impacts caused by a climate (adaptation) strategy or program. One of the first things in getting started with the assessment is to discuss, identify and explicate the decisions and options related to the assessment problem. In pairs, choose one climate (adaptation) strategy/program from the material list below and identify and write out answers to the following questions based on the material. Use your own reasoning and knowledge or other sources (e.g. Google search) to complement the material where it is incomplete or inconclusive.

Write your answers on either group member's user page (other member adds a link to the answers on his/her user page). DO NOT WRITE YOUR ANSWERS ON THIS PAGE!

Questions:

  • What are the aims/goals of the strategy/program, i.e. what are the desired impacts and outcomes striven for?
    • Who are those that benefit if the aims/goals of the strategy/program are reached?
  • What are the actions that need to be taken (or are intended to be taken) in order to progress towards the aims/goals?
    • Who are those that actually realize these actions?
  • What are the decisions that need to be made in order to enable/promote the actions?
    • Who are the decision makers?
  • What direct or indirect health impacts, positive or negative, do (or may) these decisions and actions have?
    • Where and how do these impacts take place, and who are those that face these health impacts in practice?
    • Are the health impacts big or small in relation to other impacts (e.g. economical, social, climate, other environmental, ...)?
    • Do the intended policies result in win-win, win-lose, lose-win, or lose-lose situations with regard to health and other impacts?
  • Formulate a plausible and meaningful specific assessment question that takes account of (some of) the aspects considered in above questions.
  • Extra question: In what ways do your answers represent, or not represent, "shared understanding"? (The climate program/strategy can be considered a compilation of contributions by many experts, attempting to reflect the views and needs of different decision makers and stakeholders.)

Materials:

Homework 5: Collaboration in climate policy assessment

This exercise continues from homework 4. With the same pair, using the same material, and building on your homework 4 answers, identify and write out your answers to the following questions. Narrow your scrutiny down to e.g. one or two decisions/actions/goals if needed. Base your answers on the climate program/strategy paper you have chosen, but also apply your own reasoning, other DARM 2013 course materials etc., particularly on the second set of questions.

Write your answers on either group member's user page (other member adds a link to the answers on his/her user page). DO NOT WRITE YOUR ANSWERS ON THIS PAGE!

Homework 5, part A: Questions about identifying roles and participation:

  • Who are the relevant participants of the assessment?
  • What roles do (or may) the different participants take in the assessment?
  • What kind of relevant knowledge do (or may) they have regarding the assessment?
  • What needs and aims do they represent in the assessment?

Homework 5, part B: Consider also the following questions about facilitating collaboration:

  • How could the relevant participants be involved in the assessment in an effective way?
  • How can the quality of an assessment be assured if anyone can participate?
  • How can you prevent malevolent contributions where the purpose is to vandalise the process?
  • How can you make the outcome converge to a conclusion, because all issues are uncertain and controversial?
  • How can you ensure that the outcomes are useful for the users?

Homework 5, part C: Prepare the following tables from the climate programme of your selection. Instructions for table structures can be found at Training assessment.

  • Decisions table
  • Endpoints table

Homework 6: Structure of pages and objects and R code

The objective of this homework is that you learn in practice what the different parts of a page are and how they are related to each other and to other pages. In particular, one objective is to understand the role of R code in this system. You should learn to identify key things in a piece of code, but you are not expected to be able to write code or explain what it does in detail.

With your pair, select and reserve three pages (by adding your usernames beside the page link) from the list below. At least two of them have to contain t2b tables and R code. Go through the content by doing all of the key tasks below, if possible. Also look at the additional questions and answer at least some of them. Write your answers on the page by using the comment, defend (when things are OK), and attack (when things are not OK) buttons. If you can, improve the content or suggest tasks for improvement.

In addition, select three other pages from the list such that another pair has already done the work. Read the content and their comments, and agree or disagree with them. Try to improve the content further.

Key tasks
  • Check that the page has all subheadings that belong to the page type. Add, if missing.
  • Categorise the page to relevant categories.
  • Organise the content into the right subheadings. In particular, look at what is data and what is a result.
  • Check and update the Dependencies. Also check that the Answers in dependency pages are coherent with this page.
  • Write R codes that a) create the ovariable (under Calculations) and b) get the latest ovariable and print basic results (under Answer); a minimal sketch is given after this list.
  • Test any existing code and report its functionalities on the page.
  • Write or update a summary (one paragraph in the very beginning explaining the main points of the text) on the page. If the content is too unclear to write a good summary, write down clarification questions to the moderator of that page.
  • If you have problems with any previous steps, describe them on the relevant point on the page.
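
The following is a minimal sketch of the two R codes asked for above, assuming the OpasnetUtils R package and its Ovariable, objects.store, objects.latest, EvalOutput and oprint functions (check the Object-oriented programming in Opasnet and OpasnetBaseUtils method pages for the current function names). The page identifier Op_enXXXX, the object name exposure and the data values are hypothetical placeholders, not taken from any real page.

library(OpasnetUtils)  # assumed Opasnet R package

# a) Under Calculations: create an ovariable from the data on the page
exposure <- Ovariable("exposure",
  data = data.frame(
    Pollutant = c("PM2.5", "NO2"),  # hypothetical index column
    Result    = c(8.5, 22.0)        # hypothetical result values
  )
)
objects.store(exposure)  # store the ovariable in the Opasnet Base

# b) Under Answer: fetch the latest stored ovariable and print basic results
exposure <- objects.latest("Op_enXXXX", code_name = "exposure")  # placeholder page ident
exposure <- EvalOutput(exposure)   # evaluate the ovariable
oprint(summary(exposure))          # print a basic result table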


Additional questions
  • Does the page have a correct page type?
  • Does the page have a question? Is it clear and unambiguous?
  • Does the page have an answer to the question? Does it actually give an answer to what is asked?
  • With variables, is the answer given as a link to a model run with calculated results? If yes,
    • Does the model run have a clear result table?
    • Does the model run have a clear result graph?
    • Is it clear where the code that was used to run the results is?
  • In method pages: based on the guidance in the answer, is it possible to actually use the method in an assessment?
  • In method pages: What data is required to be able to use the method? Are the requirements listed under "Inputs"?
  • Are there data on the page that are needed to answer the question? Are they in machine-readable format (i.e., in a t2b table or directly stored in the database)?
    • Are the data under Rationale/Data subheading, (or in methods under Rationale/Inputs)?
  • Is there data or text that is NOT needed to justify the answer? Would that data be in better place on another page with a different question? What would that question be?
  • If the data is needed but is not used in the Answer, update it or suggest tasks to update it.
  • Are there external variables whose values need to be known to be able to estimate this object? If yes,
    • Are these listed under Rationale/Dependencies?
    • Are there equations (as text) for calculating this object based on the dependencies under Rationale/Formula (or Rationale/Calculations)?
  • Is there an R code that implements the object?
    • With variables, is the code under Rationale/Calculations?
    • With methods, is the code under Answer?
    • If there are dependencies and formula, does the code take them in to produce an ovariable?
    • If there are data, does the code take them in to produce an ovariable?
    • When you run the code, does it crash (i.e. produce an error message) before completion? When and why? (Use show code and the error message to understand what's going on; see the sketch after this list.)
    • Are there several different codes on the page? Are their purposes clear?
    • Does the page use other pages (objects) in calculations? Are these connections listed explicitly as links under the R code?
  • Does the page have an evaluation (edistymisluokitus) in either a separate box in the beginning, or in the metadata box?
  • Does the page have other subheadings (See also, References, Related files, Keywords)?
    • Are there links to other related pages? Are relevant links missing?
  • Is the page categorised to relevant categories?
  • With encyclopedia pages: is the content detailed enough so that one or more variables or methods could be made based on it? Does such page(s) exist? Are these pages linked to each other?
  • Does the page explain its links to other pages? Is it clear how the page could be used as a part of an assessment?
  • Do you find other pages that actually have duplicate content? Is some content outdated (based on e.g. version history)? Suggest how pages should be updated, deleted, or merged.
  • Do you find errors or mistakes on the page?
  • Is the text clear?
  • Write or update a summary (one paragraph in the very beginning explaining the main points of the text) on the page. If the content is too unclear to write a good summary, write down clarification questions to the moderator of that page.
  • Is the text properly referenced?
  • Are there discussions on the Talk page? If yes,
    • Have they been linked to from the main page?
    • Have the current resolutions been incorporated in the main page?
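
When testing existing code (see the key tasks and the crash question above), one generic way in R to capture an error message so that it can be reported on the page is to wrap the code in tryCatch. This is only a sketch; run_page_code() is a hypothetical placeholder for the code copied from the page.

# Run the page's code and report the error message instead of stopping the session,
# so that "when and why it crashed" can be written down on the page.
result <- tryCatch(
  run_page_code(),  # hypothetical placeholder for the code under test
  error = function(e) {
    message("The code crashed with error: ", conditionMessage(e))
    NULL
  }
)
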
With R code
Without R code

Homework 7: Structured discussion

The objective of this homework is to teach how to organise existing written material into a structured discussion with a main statement and related arguments. In addition, students should learn to develop and use their own arguments within a structured discussion.

The work is based on the Environmental Impact Assessment Directive, which is currently under revision.

Your task is to take the material listed above and initiate and participate in structured discussions on page Talk:Environmental impact assessment directive according to the instructions on page Discussion. The original statements of the discussions are:

  • EIA directive works mostly very well.
  • The participation process required in the EIA directive is useless.
  • The current proposal does not leave enough flexibility to member states.
  • Accredited quality controllers will not improve the EIA process. On the contrary, they will reduce the transparency and thus possibilities to participate.

As facilitators, you should pay attention to getting as many different opinions documented as possible. So, jump into the role of a stakeholder and try to think what he/she would say. Possible roles include:

  • A national authority giving environmental permissions.
  • A company applying for a permit for some activity and making an EIA about it.
  • A nature conservationist.
  • A local politician interested in both nature and local economy.
  • A citizen.

Note that you are allowed to:

  • Contradict your own arguments.
  • Update and improve statements if they are too vague or poorly written. However, be careful not to push the existing argumentation out of context. Instead of making large changes to a statement, start a new discussion with your new statement.
  • Add your signature to other people's arguments if you agree with them.
  • Clarify other people's arguments, if you do it carefully and do not change the meaning.
  • Copy arguments from one discussion to another, if they are relevant. But instead of copying large blocks, make references to the other discussion.

Homework 8: Scientific contributions: exposure-response functions

Page where your contributions are to be added: Indoor environment quality (IEQ) factors

Instructions

  • Select one article from the list below
  • Every pair should have a different article. Write your username after the article you have selected
  • Go to Indoor environment quality (IEQ) factors
  • Write your username and the whole reference (name, authors etc. of the article) into the Rationale section. Use the RefTaq functionality for the latter (an example exists in the Rationale).
  • Add one row into the IEQ table and write the reference again in short form (e.g. Matthews et al. 2002) into the Description/Reference box.
  • Identify the exposure, response, OR and other parameters from your selected article and fill the row you made with this information. However, you do not have to fill the slots "Exposure metric" and "Significance".
  • If the OR and CIs are given, write them in the form OR (lower CI-upper CI), for example 1.8 (1.6-2.2).
  • Feel free to create more rows if your article has more than one exposure-response function.
  • IMPORTANT: In the "rationale" section below the IEQ table, write your estimate of the precision and plausibility of the OR. You can find an example e.g. on the page Concentration-response to PM2.5.
  • If two pairs get estimates for the same exposure-response function, you should consider how to combine them, i.e. what the single common estimate for this specific exposure-response function is.
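
One common way (not prescribed by the assignment, so agree on the approach with the other pair) to combine two estimates of the same exposure-response function is inverse-variance weighting on the log-odds scale. A minimal R sketch with hypothetical odds ratios and 95% confidence intervals:

# Hypothetical odds ratios and 95% confidence intervals from two articles
or    <- c(1.8, 1.5)
lower <- c(1.6, 1.1)
upper <- c(2.2, 2.0)

# Work on the log scale; the standard error follows from the width of the 95% CI
log_or <- log(or)
se     <- (log(upper) - log(lower)) / (2 * 1.96)

# Fixed-effect (inverse-variance) pooled estimate
w         <- 1 / se^2
pooled    <- sum(w * log_or) / sum(w)
pooled_se <- sqrt(1 / sum(w))

# Back-transform and report in the OR (lower CI-upper CI) format used in the IEQ table
ci <- exp(pooled + c(-1.96, 1.96) * pooled_se)
cat(sprintf("%.1f (%.1f-%.1f)\n", exp(pooled), ci[1], ci[2]))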

Material

Homework 9: Evaluation of assessment

----#: . The work on homework 9 can begin on Tuesday 5 February, after everyone have completed their homework 3 exercises. --Mikko Pohjola 13:27, 29 January 2013 (EET) (type: truth; paradigms: science: comment)

----#: . This assignment description is a draft. The homework 9 assignment will still be slightly improved before Tuesday 5 February. --Mikko Pohjola 13:27, 29 January 2013 (EET) (type: truth; paradigms: science: comment)

In this exercise you are asked to look into and evaluate two (other than your own) homework 3 draft assessments. Find the assessments by the two users below you on the user/homework list at the top of this page (the last user on the list shall pick the first two users on the list, and the second-last shall pick the last and the first user).
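
A small R sketch of this wrap-around rule, using a few user names from the follow-up table purely for illustration:

# Each student evaluates the two users below them on the list, wrapping around at the end
users <- c("User:Adedayo", "User:Adnank", "User:EmmaA", "User:Stefania")  # shortened example list
n <- length(users)
for (i in seq_len(n)) {
  below <- c(users[(i %% n) + 1], users[((i + 1) %% n) + 1])  # the two users below, with wrap-around
  cat(users[i], "evaluates:", paste(below, collapse = " and "), "\n")
}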

First, characterize the draft assessments according to the Knowledge-policy interaction and Dimensions of openness frameworks. The things to consider in the characterization are listed and explained in tables A, B and C below.

Table A. Framework for characterizing the settings for health, safety and environmental assessments relevant to materials processing and related public policy.
Impacts
  • Example categories: Environment; Health; Other (what?)
  • Questions:
    • Which impacts are addressed in assessment?
    • Which impacts are most significant?
    • Which impacts are most relevant for the intended use?
Causes
  • Example categories: Production; Consumption; Transport; Heating, Power production; Everyday life
  • Questions:
    • Which causes of impacts are recognized in assessment?
    • Which causes of impacts are most significant?
    • Which causes of impacts are most relevant for the intended use?
Problem owner
  • Example categories: Policy maker; Industry, Business; Expert; Consumer; Public
  • Questions:
    • Who has the interest, responsibility and/or means to assess the issue?
    • Who is seen to actually conduct the assessment?
    • Who has the interest, responsibility and/or power to make decisions and take actions upon the issue?
    • Who are affected by the impacts?
Target
  • Example categories: Policy maker; Industry, Business; Expert; Consumer; Public
  • Questions:
    • Who are the intended users of assessment results?
    • Who needs the assessment results?
    • Who can make use of the assessment results?
Interaction (see tables B and C for advice)
  • Example categories: Isolated; Informing; Participatory; Joint; Shared
  • Questions:
    • How does assessment interact with the intended use of its results?
    • How does assessment interact with other actors in its context?
    • What is the degree of openness in assessment (and management)?

In order to identify the mode of interaction the assessment builds on, characterize the dimensions of openness in the assessment. See table B for the dimensions of openness. The modes of interaction are explained in table C.

Table B. Dimensions of openness.
  • Scope of participation: Who are allowed to participate in the process?
  • Access to information: What information about the issue is made available to participants?
  • Timing of openness: When are participants invited or allowed to participate?
  • Scope of contribution: To which aspects of the issue are participants invited or allowed to contribute?
  • Impact of contribution: How much are participant contributions allowed to have influence on the outcomes? In other words, how much weight is given to participant contributions?

Table C. Explanations of categories of interaction within the knowledge-policy interaction framework.
  • Isolated: Assessment and use of assessment results are strictly separated. Results are provided to intended use, but users and stakeholders shall not interfere with making of the assessment.
  • Informing: Assessments are designed and conducted according to specified needs of intended use. Users and limited groups of stakeholders may have a minor role in providing information to assessment, but mainly serve as recipients of assessment results.
  • Participatory: Broader inclusion of participants is emphasized. Participation is, however, treated as an add-on alongside the actual processes of assessment and/or use of assessment results.
  • Joint: Involvement of and exchange of summary-level information among multiple actors in scoping, management, communication and follow-up of assessment. On the level of assessment practice, actions by different actors in different roles (assessor, manager, stakeholder) remain separate.
  • Shared: Different actors involved in assessment retain their roles and responsibilities, but engage in open collaboration upon determining assessment questions to address and finding answers to them as well as implementing them in practice.

Second, evaluate the assessment drafts according to the (slightly modified) Properties of good assessment framework. Base your evaluation on the characterization you made. The things to consider in the evaluation are listed and explained in table D below. For each attribute (i.e. a thing to consider) give a numerical evaluation on a 1-5 scale (1 = poor, 5 = excellent). Also write down your reasoning for each numerical evaluation. If something seems to be completely missing or impossible to evaluate, the numerical evaluation can be 0 (also write down your reasoning for why the particular aspect of the draft assessment deserves an evaluation of 0).

Table D. A slightly modified version of the properties of good assessment framework.
  • Quality of content: Specificity, exactness and correctness of information; correspondence between questions and answers. Questions: How exact and specific are the ideas in the assessment? How completely does the (expected) answer address the assessment question? Are all important aspects addressed? Is there something unnecessary?
  • Applicability:
    • Relevance: Correspondence between output and its intended use. Questions: How well does the assessment address the intended needs of the users? Is the assessment question good in relation to the purpose of the assessment?
    • Availability: Accessibility of the output to users in terms of e.g. time, location, extent of information, extent of users. Questions: Is the information provided by the assessment (or would it be) available when, where and to whom it is needed?
    • Usability: Potential of the information in the output to generate understanding among its user(s) about the topic of assessment. Questions: Would the intended users be able to understand what the assessment is about? Would the assessment be useful for them?
    • Acceptability: Potential of the output being accepted by its users; fundamentally a matter of its making and delivery, not its information content. Questions: Would the assessment (both its expected results and the way the assessment is planned to be made) be acceptable to the intended users?
  • Efficiency: Resource expenditure of producing the assessment output either in one assessment or in a series of assessments. Questions: How much effort would be needed for making the assessment? Would it be worth spending the effort, considering the expected results and their applicability for the intended users? Would the assessment results be useful also in some other use?

Write your characterizations and evaluations on your own user page (DO NOT WRITE YOUR ANSWERS ON THIS PAGE!).

Evaluation of assessments is not only something to be done after an assessment has been completed. Instead, evaluation should be seen as a means to guide the execution of assessments towards their aims while they are still happening. Therefore, the third task of this exercise is to formulate suggestions for developing/improving the draft assessment and write them down as comments/arguments to the corresponding user pages.

Because homework 3 answers will be used as materials in this exercise, the deadline for homework 3 will be changed to Monday 4 February. Correspondingly it is recommended that you attempt to do this exercise only after that date.

Links to some examples of using the above mentioned evaluation frameworks: